SDXL Attention stats logs
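Statistics like the ones below can be collected with a custom attention processor that computes the softmax probabilities explicitly and summarizes them on every call. The following is a minimal sketch, assuming the diffusers Attention/AttnProcessor API; StatsAttnProcessor, the stats list, and the prompt are illustrative, not the exact code behind this log, and attention-mask/norm handling is omitted for brevity.

import torch
from diffusers import StableDiffusionXLPipeline

stats = []  # one record per attention call: (module, kind, mean, std, min, max)

class StatsAttnProcessor:
    """Plain (non-fused) attention that also records softmax statistics."""

    def __init__(self, name):
        self.name = name

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        if encoder_hidden_states is None:
            encoder_hidden_states = hidden_states

        query = attn.head_to_batch_dim(attn.to_q(hidden_states))
        key = attn.head_to_batch_dim(attn.to_k(encoder_hidden_states))
        value = attn.head_to_batch_dim(attn.to_v(encoder_hidden_states))

        # Softmax attention probabilities, shape (batch * heads, q_len, kv_len).
        probs = attn.get_attention_scores(query, key, attention_mask)
        stats.append((self.name, "cross" if is_cross else "self",
                      probs.mean().item(), probs.std().item(),
                      probs.min().item(), probs.max().item()))

        out = attn.batch_to_head_dim(torch.bmm(probs, value))
        out = attn.to_out[0](out)   # output projection
        return attn.to_out[1](out)  # dropout

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# One recording processor per attention module; the keys come from the UNet
# itself, e.g. "down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor".
pipe.unet.set_attn_processor({
    name: StatsAttnProcessor(name.removesuffix(".processor"))
    for name in pipe.unet.attn_processors
})

pipe("a prompt").images[0].save("generated_image.png")

The SDXL UNet contains 140 such attn1/attn2 modules, so each consecutive block of 140 entries in stats corresponds to one diffusion step, which is how the per-timestep groups below can be formed.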
Image saved as 'generated_image.png'.
Attention statistics per diffusion timestep:
Timestep 1.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0027, min=0.0000, max=0.9492
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.8486
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.9780
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=1.0000
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9380
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.6533
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9160
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.7329
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0076, min=0.0000, max=0.9956
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.5493
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7710
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.6133
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9927
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.6792
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0074, min=0.0000, max=0.9863
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.7139
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9751
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7710
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9609
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8096
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8604
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.7578
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7148
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.8110
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.8799
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.8828
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.8179
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.9048
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9507
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.8066
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9253
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0221, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8906
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.9658
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8711
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.9243
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8550
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9507
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6821
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.8589
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.9199
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8403
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.9229
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7500
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9180
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9805
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.8408
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9453
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.7104
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0078, min=0.0000, max=0.9956
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.4543
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9790
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.7017
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9927
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.6924
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0077, min=0.0000, max=0.9673
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.8125
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9697
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.8794
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8462
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.7695
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9707
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.7529
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9355
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0415, min=0.0000, max=0.7271
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.9658
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0464, min=0.0000, max=0.8901
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9790
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.4946
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.8022
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.4351
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9023
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.6187
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9966
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0168, min=0.0000, max=0.6401
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9590
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.6538
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8335
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.4805
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8779
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.5518
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9180
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0186, min=0.0000, max=0.6055
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9692
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.7871
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7402
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0436, min=0.0000, max=0.8975
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9868
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.4065
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8813
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0163, min=0.0000, max=0.4871
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8438
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0156, min=0.0000, max=0.5957
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7920
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0157, min=0.0000, max=0.4382
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.9092
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0155, min=0.0000, max=0.3562
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8706
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0169, min=0.0000, max=0.4309
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8789
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.5396
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8188
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.8350
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.9106
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.7148
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8965
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0364, min=0.0000, max=0.9160
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9980
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.5083
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9302
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0166, min=0.0000, max=0.5557
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9404
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0152, min=0.0000, max=0.4302
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8628
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.5464
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.9946
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.7593
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8521
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.7852
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8833
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.7583
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.9360
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.8330
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8926
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.7632
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.7119
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8721
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0029, min=0.0000, max=0.8267
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.6270
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0029, min=0.0000, max=0.9458
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.5674
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0027, min=0.0000, max=0.8916
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.9995
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.8657
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.8804
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0029, min=0.0000, max=0.9775
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.4807
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.8799
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=1.0000
Timestep 21.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.8936
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8623
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.9204
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=1.0000
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9404
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.6768
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9653
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.7251
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0085, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.5957
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.8813
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7754
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9956
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.6929
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0080, min=0.0000, max=0.9922
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.7632
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.9526
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.8770
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9014
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.8623
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8955
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0442, min=0.0000, max=0.7798
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8579
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0433, min=0.0000, max=0.8335
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.8027
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0461, min=0.0000, max=0.9048
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7964
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.9243
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9443
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8726
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8911
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.9702
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8911
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.9634
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8394
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.9282
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7954
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.9678
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7207
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.9028
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8643
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8110
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.9570
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6968
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.9341
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9419
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.8613
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.9121
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.6992
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9912
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.4624
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9453
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7407
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9946
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.6816
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9697
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8228
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9722
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.8823
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8936
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7480
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9297
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.7593
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9663
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.7256
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8901
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0451, min=0.0000, max=0.8828
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9487
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.4861
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9316
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.5229
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9370
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.7227
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9478
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0170, min=0.0000, max=0.6748
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9678
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.6543
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.9004
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.4998
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.9155
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.5454
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.9585
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.6431
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.9170
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.8247
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7305
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0449, min=0.0000, max=0.9165
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9771
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0186, min=0.0000, max=0.4434
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8403
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0169, min=0.0000, max=0.5415
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8408
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0158, min=0.0000, max=0.7021
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8169
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0161, min=0.0000, max=0.4829
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8799
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0158, min=0.0000, max=0.3906
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7959
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0170, min=0.0000, max=0.4814
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8950
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.5654
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8394
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.7988
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9351
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7803
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8765
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9316
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9995
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0241, min=0.0000, max=0.5415
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9814
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0169, min=0.0000, max=0.5718
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9722
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0157, min=0.0000, max=0.4541
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9419
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.5767
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9985
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.8262
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8384
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.8003
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.9531
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.7944
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8604
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.8652
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9419
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.7812
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.8770
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.8784
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0028, min=0.0000, max=0.8311
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6450
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0027, min=0.0000, max=0.9194
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.5889
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9307
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.8462
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.7876
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8701
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9629
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.4717
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.8418
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=1.0000
Timestep 41.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.8457
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.8589
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.9043
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.9248
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9243
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.6943
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9644
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.7236
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0088, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0239, min=0.0000, max=0.6353
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8848
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.8125
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9971
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.7070
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0082, min=0.0000, max=0.9937
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.7764
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.9497
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.9287
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8691
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9185
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.9575
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0452, min=0.0000, max=0.7998
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8906
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.8481
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.7114
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0476, min=0.0000, max=0.9126
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7715
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0448, min=0.0000, max=0.9302
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9404
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.8784
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9204
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.9404
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8945
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.9702
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8525
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.9316
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8853
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9702
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7017
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.9204
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8809
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.9951
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7681
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.9609
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6606
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.9390
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9209
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.8740
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8633
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.7095
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0074, min=0.0000, max=0.9849
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.4578
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9419
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7520
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9951
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.6812
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9863
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.8311
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9878
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.8833
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9346
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.7412
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9497
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.7559
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9604
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.7271
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8394
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0446, min=0.0000, max=0.8765
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9351
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.5269
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9277
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.5815
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9307
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.7476
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9204
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0172, min=0.0000, max=0.6621
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9521
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6675
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8916
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.5161
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8384
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.5518
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.9136
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.6685
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8164
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.8389
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7314
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0453, min=0.0000, max=0.9224
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9385
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.4595
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.7930
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.5874
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8125
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0161, min=0.0000, max=0.7407
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8403
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0166, min=0.0000, max=0.5166
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7710
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0160, min=0.0000, max=0.4248
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6982
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0170, min=0.0000, max=0.5376
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8818
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.5869
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7979
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.8213
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8979
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.7915
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8115
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0423, min=0.0000, max=0.9326
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9990
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.5820
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9854
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0171, min=0.0000, max=0.5649
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9839
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0159, min=0.0000, max=0.4612
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9370
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.5894
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9980
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.8535
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8560
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.8135
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.9785
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.7974
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8325
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.8696
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9644
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.7891
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.9038
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.8813
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0028, min=0.0000, max=0.8740
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.6592
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9067
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.6001
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9346
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.8262
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.7344
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8560
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9624
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.4758
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7080
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.5527
Timestep 61.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8491
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.8545
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.8828
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8271
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9170
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7144
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9688
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.7368
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0089, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.6562
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8599
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.8276
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.7104
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0082, min=0.0000, max=0.9961
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.7803
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9585
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.9448
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9033
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.9443
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.9717
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0459, min=0.0000, max=0.8071
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.9033
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.8525
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.7266
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9233
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.7358
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0458, min=0.0000, max=0.9321
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9263
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.8799
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9238
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.9272
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9126
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.9751
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8301
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.9307
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9082
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.9795
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7910
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.9287
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8574
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.9956
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7104
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.9609
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6421
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.9409
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9199
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0386, min=0.0000, max=0.8843
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8794
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.7173
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9766
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.4524
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9390
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.7676
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9941
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.6855
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9902
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8359
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9922
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.8843
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9395
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7334
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9756
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.7529
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9492
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7212
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8862
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.8667
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9360
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.5630
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9277
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.6025
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9307
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.7485
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8716
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0174, min=0.0000, max=0.6782
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9395
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.7124
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8301
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0176, min=0.0000, max=0.5273
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7832
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.5610
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8359
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.6919
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7827
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.8511
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7056
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0456, min=0.0000, max=0.9248
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8286
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0191, min=0.0000, max=0.4617
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7295
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.5967
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.7603
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0164, min=0.0000, max=0.7656
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8433
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0169, min=0.0000, max=0.5386
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7056
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0163, min=0.0000, max=0.4475
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6699
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0171, min=0.0000, max=0.6226
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8408
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.5864
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7876
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.8516
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8696
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7949
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6812
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0436, min=0.0000, max=0.9277
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9946
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.5918
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9858
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0173, min=0.0000, max=0.5469
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9849
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0161, min=0.0000, max=0.4624
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9121
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0191, min=0.0000, max=0.5938
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9976
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.8677
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8848
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0236, min=0.0000, max=0.8179
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.9746
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.7979
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8989
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.8691
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9819
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.8062
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.9204
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.8818
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0027, min=0.0000, max=0.8950
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.6699
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.8706
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.6011
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9297
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8110
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.7383
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8423
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9565
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.4785
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7480
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.5430
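
The per-timestep blocks in this dump are simply such a record rendered in module order; reproducing this file's layout takes one nested loop (a hypothetical printer over the stats_log sketched above):

def dump_stats(stats_log):
    for t in sorted(stats_log):
        rows = stats_log[t]
        print(f"Timestep {t}: {len(rows)} attention activations recorded.")
        for name, kind, mean, std, lo, hi in rows:
            print(f"Module {name} ({kind}): mean={mean:.4f}, "
                  f"std={std:.4f}, min={lo:.4f}, max={hi:.4f}")
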
Timestep 81.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8389
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8511
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.8706
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8418
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9097
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.7212
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9653
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.7339
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0090, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.6660
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8364
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.8315
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.7075
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0082, min=0.0000, max=0.9971
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.7827
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9629
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.9507
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8979
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.9595
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.9741
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0464, min=0.0000, max=0.8105
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.9019
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.8535
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7388
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.9360
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6904
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0465, min=0.0000, max=0.9336
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9258
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8970
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9243
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.9233
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9233
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.9800
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7837
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.9316
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.9043
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.9824
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8320
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.9487
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8208
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.9961
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6489
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9590
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6279
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9434
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9111
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.8901
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8789
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.7241
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9795
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.4619
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9351
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.7725
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9927
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.6899
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9907
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.8394
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9932
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.8892
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9448
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7324
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9727
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.7520
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9355
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7139
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8975
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0440, min=0.0000, max=0.8579
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9380
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.5757
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9248
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.6206
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9346
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.7490
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8247
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.7163
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9355
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.7432
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8325
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0177, min=0.0000, max=0.5278
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7847
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.5679
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8516
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.7153
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7656
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.8589
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7524
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0458, min=0.0000, max=0.9272
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8071
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.5039
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6587
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.5938
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7402
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0167, min=0.0000, max=0.7822
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8301
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0172, min=0.0000, max=0.5601
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6968
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0165, min=0.0000, max=0.4653
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6636
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0171, min=0.0000, max=0.6851
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7646
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.5747
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7749
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.8687
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8569
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.8052
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6831
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0447, min=0.0000, max=0.9199
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9570
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.5962
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9854
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0174, min=0.0000, max=0.5396
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9897
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0163, min=0.0000, max=0.4629
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8242
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.5947
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9790
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.8765
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8716
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0239, min=0.0000, max=0.8188
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8916
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.8135
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8872
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8672
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9624
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.8164
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9204
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.8818
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0027, min=0.0000, max=0.9131
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.6821
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.8198
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.6055
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9219
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8013
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.7607
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8315
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9380
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.4805
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.7554
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.6348
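
A note on reading these numbers: the means are fixed by softmax normalization, not by the model. Each query's attention row sums to 1 over its key_len keys, so the mean entry is exactly 1/key_len. That is why every cross-attention mean sits at 0.0130 (1/77, the CLIP text context length) and the self-attention means at 0.0010 (1/1024) or 0.0002 (1/4096), matching the 32x32 and 64x64 token grids of the respective blocks (assuming the default 1024x1024 output, i.e. a 128x128 latent). The signal is in std and max, which vary by module and, mildly, by timestep; the per-module values are otherwise strikingly stable between timesteps 61, 81 and 101. A quick check:

for key_len in (77, 32 * 32, 64 * 64):
    print(f"key_len={key_len}: expected mean = {1 / key_len:.4f}")
# key_len=77:   0.0130 -> every attn2 (cross attention) row above
# key_len=1024: 0.0010 -> attn1 in down_blocks.2, mid_block, up_blocks.0
# key_len=4096: 0.0002 -> attn1 in down_blocks.1 and up_blocks.1
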
Timestep 101.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8633
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.8477
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.8770
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.8550
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9155
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7173
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9653
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.7305
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0091, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.6685
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8374
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.8311
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.7583
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0081, min=0.0000, max=0.9971
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.7866
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9629
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.9541
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8833
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0375, min=0.0000, max=0.9692
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.9766
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0469, min=0.0000, max=0.8135
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.8721
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0459, min=0.0000, max=0.8579
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7207
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9419
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6440
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0471, min=0.0000, max=0.9351
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9229
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.9175
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9214
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.9253
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9302
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.9834
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7568
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.9370
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8809
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.9834
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8496
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.9653
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7832
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9961
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5552
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.9688
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6509
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.9448
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9023
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.8940
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8838
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.7271
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9771
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.4763
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9224
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.7739
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9902
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.6924
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9878
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8403
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9922
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.8896
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9419
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7231
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9678
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.7505
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9487
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7075
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8960
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0439, min=0.0000, max=0.8574
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9429
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.5684
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9219
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0183, min=0.0000, max=0.6255
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9434
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.7764
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.7285
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0177, min=0.0000, max=0.7383
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9375
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.7622
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8237
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0178, min=0.0000, max=0.5249
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7441
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.5684
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8618
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.7339
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7339
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8647
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8110
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0461, min=0.0000, max=0.9302
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8110
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.5366
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6230
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.5859
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7432
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0170, min=0.0000, max=0.7939
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8052
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.5962
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6616
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0166, min=0.0000, max=0.4800
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6704
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0172, min=0.0000, max=0.7305
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7026
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.5542
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7505
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.8804
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8467
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.8145
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6714
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.9307
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9619
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.6055
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9839
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.5537
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9927
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0165, min=0.0000, max=0.4805
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7700
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.5952
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7178
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.8823
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8550
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.8223
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7231
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.8433
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8530
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.8657
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7939
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0387, min=0.0000, max=0.8223
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9238
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.8818
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9272
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.6880
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.7485
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.6313
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9121
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.7949
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.7876
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8267
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9048
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.4805
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.7510
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.5322
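
Because the dump is flat text, a throwaway parser is enough to surface the main per-module trend: cross-attention std climbs with transformer-block depth inside each 10-block stack (e.g. up_blocks.0.attentions.1 above: std around 0.020 at transformer_blocks.0 vs around 0.045 at transformer_blocks.9), i.e. deeper blocks concentrate their attention on fewer text tokens. A minimal sketch, assuming this log was saved as attention_stats.log (hypothetical filename):

import re
from collections import defaultdict

pattern = re.compile(
    r"Module (\S+)\.transformer_blocks\.(\d+)\.attn2 \(cross attention\): "
    r"mean=[\d.]+, std=([\d.]+)")

by_depth = defaultdict(list)  # block index -> all cross-attn stds at that depth
with open("attention_stats.log") as fh:
    for line in fh:
        m = pattern.search(line)
        if m:
            by_depth[int(m.group(2))].append(float(m.group(3)))

for depth in sorted(by_depth):
    vals = by_depth[depth]
    print(f"transformer_blocks.{depth}: mean cross-attn std = "
          f"{sum(vals) / len(vals):.4f}")
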
Timestep 121.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8843
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.8481
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.8857
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.8657
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9175
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.7061
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9717
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.7310
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0092, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.6675
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8389
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.8276
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8188
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0081, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.7930
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9590
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.9546
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8696
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.9746
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.9795
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0473, min=0.0000, max=0.8174
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.8101
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0463, min=0.0000, max=0.8608
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.6807
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.9443
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6401
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0477, min=0.0000, max=0.9365
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9199
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.9287
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9175
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.9297
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9326
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7202
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.9438
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8345
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.9839
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8594
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.9727
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7710
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.9956
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5444
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.9805
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6919
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9453
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8950
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.8970
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9048
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.7275
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9717
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.4883
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9194
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.7754
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9873
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.6934
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9800
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.8364
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9907 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.8867 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9087 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7192 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9653 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.7485 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9585 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7021 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8926 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.8574 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9448 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.5483 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.9155 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.6255 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9458 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0191, min=0.0000, max=0.8105 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.6924 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0178, min=0.0000, max=0.7573 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9351 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.7754 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7832 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.5190 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7222 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.5664 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8691 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.7476 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7358 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8687 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7993 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0463, min=0.0000, max=0.9370 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8081 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.5635 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.5918 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.5718 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.7446 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0173, min=0.0000, max=0.8047 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7705 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0178, min=0.0000, max=0.6489 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6387 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0168, min=0.0000, max=0.4988 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6548 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0174, min=0.0000, max=0.7710 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6377 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0217, min=0.0000, max=0.5479 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7168 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.8872 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8638 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.8218 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6748 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0463, min=0.0000, max=0.9385 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9648 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.6167 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9819 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0177, min=0.0000, max=0.5625 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9941 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0166, min=0.0000, max=0.4890 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7734 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6050 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7329 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.8853 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8301 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8184 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7202 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.8633 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8198 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.8706 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.7065 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.8320 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9214 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.8838 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0026, min=0.0000, max=0.9385 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.6914 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.6670 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6812 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.8916 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.7954 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8076 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8198 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8936 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.4829 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.7485 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.5435 | |
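A minimal sketch of how per-module statistics like the ones above can be collected. This is a reconstruction under stated assumptions, not the script that produced this log: it assumes the diffusers StableDiffusionXLPipeline and the classic (non-fused) attention path, where Attention.get_attention_scores materializes the softmax map that SDPA kernels never expose; StatsAttnProcessor, the stats dict, and the prompt are hypothetical names. The "140 attention activations" per timestep match the 140 attn1/attn2 modules named in this log (8 in down_blocks.1, 40 in down_blocks.2, 20 in mid_block, 60 in up_blocks.0, 12 in up_blocks.1).

import torch
from diffusers import StableDiffusionXLPipeline

class StatsAttnProcessor:
    # Hypothetical processor: recomputes the softmax attention probabilities
    # for one attn1/attn2 module and records their mean/std/min/max.
    # Assumes 3-d hidden states and no extra norms, which holds for the
    # BasicTransformerBlock attention modules listed in this log.
    def __init__(self, name, store):
        self.name = name
        self.store = store

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        context = encoder_hidden_states if is_cross else hidden_states
        query = attn.head_to_batch_dim(attn.to_q(hidden_states))
        key = attn.head_to_batch_dim(attn.to_k(context))
        value = attn.head_to_batch_dim(attn.to_v(context))

        # Explicit softmax map, shape (batch * heads, queries, keys).
        probs = attn.get_attention_scores(query, key, attention_mask)
        self.store.setdefault(self.name, []).append({
            "kind": "cross" if is_cross else "self",
            "mean": probs.mean().item(), "std": probs.std().item(),
            "min": probs.min().item(), "max": probs.max().item(),
        })

        out = attn.batch_to_head_dim(torch.bmm(probs, value))
        return attn.to_out[1](attn.to_out[0](out))  # output projection + dropout

stats = {}
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.unet.set_attn_processor({
    key: StatsAttnProcessor(key.removesuffix(".processor"), stats)
    for key in pipe.unet.attn_processors
})
pipe("an example prompt").images[0].save("generated_image.png")

Grouping the records per diffusion timestep, as this log does, would additionally key the store on the current step, for example from a callback passed via the pipeline's callback_on_step_end argument.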
Timestep 141.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8926
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.8496
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8916
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8735
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9165
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.6909
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9751
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.7314
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0092, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.6646
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8276
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.8252
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.8604
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0080, min=0.0000, max=0.9980
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8018
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9473
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0350, min=0.0000, max=0.9551
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8579
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9775
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.9829
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0476, min=0.0000, max=0.8208
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7715
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0466, min=0.0000, max=0.8623
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.6226
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0510, min=0.0000, max=0.9448
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6348
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0482, min=0.0000, max=0.9375
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9116
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.9355
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9126
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.9321
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9326
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.9878
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6768
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.9487
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7593
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.9839
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8594
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.9722
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7368
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9951
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5288
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7114
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9458
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8887
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.8989
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9146
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.7261
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9653
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.4944
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9185
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.7769
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9829
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.6924
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9604
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.8276
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9888
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.8813
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8945
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7163
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9614
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.7456
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9648
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7002
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8892
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0436, min=0.0000, max=0.8560
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9414
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.5215
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9048
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.6270
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9399
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.8438
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.6821
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.7773
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9341
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.7861
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7163
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.5103
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7427
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.5679
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8696
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.7578
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7139
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.8701
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8291
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0465, min=0.0000, max=0.9390
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8071
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.5864
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.5645
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.5928
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.7393
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0176, min=0.0000, max=0.8159
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7305
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.6938
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5845
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0171, min=0.0000, max=0.5278
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6221
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.8105
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6377
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.5498
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6733
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8975
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8765
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0353, min=0.0000, max=0.8330
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6831
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0470, min=0.0000, max=0.9453
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9658
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.6196
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9805
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0178, min=0.0000, max=0.5674
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9937
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0168, min=0.0000, max=0.4951
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7646
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.6201
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7432
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.8892
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8062
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.8223
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6938
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.8774
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7759
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.8750
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6948
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.8564
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9155
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.8867
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9463
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.6992
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.6143
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.7158
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.8613
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.8022
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8242
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8208
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8755
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.4863
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.7358
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.5522
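The constant means in these entries are structural, not empirical: each attention row is a softmax over N keys, so its mean is exactly 1/N regardless of prompt or timestep. With 77 text tokens, cross attention gives 1/77 = 0.0130; self attention at the 32x32 latent resolution (down_blocks.2, mid_block, up_blocks.0) gives 1/1024 = 0.0010, and at 64x64 (down_blocks.1, up_blocks.1) gives 1/4096 = 0.0002. Only std, min, and max carry per-step information. A quick arithmetic check:

# A softmax row over N keys always averages to 1/N, matching the log.
for n, label in [(77, "cross attention: 77 text tokens"),
                 (1024, "self attention: 32x32 latent tokens"),
                 (4096, "self attention: 64x64 latent tokens")]:
    print(f"1/{n} = {1 / n:.4f}  ({label})")
# 1/77 = 0.0130, 1/1024 = 0.0010, 1/4096 = 0.0002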
Timestep 161.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8916
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.8516
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8950
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8799
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9126
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.6743
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9766
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7373
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0092, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.6606
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8091
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.8286
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0074, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8877
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0080, min=0.0000, max=0.9980
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.8115
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9233
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0353, min=0.0000, max=0.9546
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8501
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9795
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.9858
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.8237
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7656
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0470, min=0.0000, max=0.8623
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5742
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0514, min=0.0000, max=0.9487
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6250
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9399
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9028
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.9404
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9058
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.9277
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9312
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9888
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6343
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.9526
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6597
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9844
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8462
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.9688
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7212
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4944
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9893
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7227
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9463
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8838
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0407, min=0.0000, max=0.9019
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9189
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.7241
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9585
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.4985
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.9180
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.7798
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9819
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6904
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9414
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.8154
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9858
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.8755
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8198
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7188
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9556
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.7427
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9668
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.7012
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8843
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8535
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.9321
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.4944
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.8901
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0186, min=0.0000, max=0.6323
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9219
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.8232
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.6841
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.7949
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9346
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.7939
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6538
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.4990
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7261
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.5693
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8647
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.7637
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6895
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.8687
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8438
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0467, min=0.0000, max=0.9404
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8042
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.6055
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.5400
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.6079
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7349
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.8262
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6846
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.7280
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5498
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0173, min=0.0000, max=0.5581
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5645
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0177, min=0.0000, max=0.8516
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6440
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.5513
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6274
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.9062
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8833
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.8560
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6885
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0476, min=0.0000, max=0.9517
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9648
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.6309
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9800
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.5698
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9932
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0169, min=0.0000, max=0.4988
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7524
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.6328
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7393
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.8936
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7793
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.8311
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6533
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.8877
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7627
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.8789
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6816
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.8726
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9082
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.8911
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9512
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7026
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.5962
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.7393
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.8364
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.8101
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8340
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8218
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8467
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.4895
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.7393
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.5552
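The fixed line format used throughout this log parses cleanly back into structured records for analysis. The sketch below matches the exact "Timestep ..." and "Module ..." patterns above; the file name and function name are hypothetical.

import re
from collections import defaultdict

TIMESTEP_RE = re.compile(r"Timestep ([\d.]+): (\d+) attention activations")
MODULE_RE = re.compile(
    r"Module (\S+) \((self|cross) attention\): "
    r"mean=([\d.]+), std=([\d.]+), min=([\d.]+), max=([\d.]+)")

def parse_attention_log(path):
    # Returns {timestep: [(module, kind, mean, std, min, max), ...]}.
    records = defaultdict(list)
    timestep = None
    with open(path) as fh:
        for line in fh:
            if (m := TIMESTEP_RE.search(line)):
                timestep = float(m.group(1))
            elif (m := MODULE_RE.search(line)) and timestep is not None:
                module, kind = m.group(1), m.group(2)
                mean, std, lo, hi = map(float, m.groups()[2:])
                records[timestep].append((module, kind, mean, std, lo, hi))
    return records

# Example: how sharply cross attention peaks at each logged timestep.
log = parse_attention_log("attention_stats.log")
for t in sorted(log):
    peaks = [hi for (_, kind, _, _, _, hi) in log[t] if kind == "cross"]
    print(f"t={t:6.1f}  max cross-attention peak = {max(peaks):.4f}")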
Timestep 181.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8828
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.8516
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8955
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8843
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9082
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6724
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9771
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7383
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0092, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.6567
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7842
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8301
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0074, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.9062
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0079, min=0.0000, max=0.9980
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.8213
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9194
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.9546
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8413
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.9810
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9888
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0482, min=0.0000, max=0.8252
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7627
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0472, min=0.0000, max=0.8618
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5977
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0518, min=0.0000, max=0.9517
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.6719
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0489, min=0.0000, max=0.9434
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8901
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.9438
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8945
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.9287
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9268
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.9893
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6006
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.9556
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5625
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.9849
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8237
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.9639
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7080
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.9937
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4990
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9907
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7778
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9473
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8770
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.9043
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9199
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.7217
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9507
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.5005
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9170
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.7856
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9824
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6860
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9253
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.8013
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9810
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0364, min=0.0000, max=0.8701
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7412
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7202
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9438
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.7402
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9668
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7075
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8760
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8579
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9204
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.4734
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8716
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0187, min=0.0000, max=0.6387
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8872
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.7627
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.6772
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.8091 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9341 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.8003 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6641 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.4871 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7158 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.5684 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8579 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.7666 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6562 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.8647 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8452 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0469, min=0.0000, max=0.9399 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8042 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.6226 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.5142 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.6177 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7319 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.8359 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6333 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.7559 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5352 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0176, min=0.0000, max=0.5806 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5278 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.8594 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6440 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.5620 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5776 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9116 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8843 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.8740 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6865 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0481, min=0.0000, max=0.9570 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9614 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.6021 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9790 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.5962 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9922 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0171, min=0.0000, max=0.5049 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7534 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0207, min=0.0000, max=0.6426 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6963 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.8970 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7568 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.8389 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6509 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.8950 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7500 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.8823 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6753 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.8804 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9009 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.8950 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9536 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7070 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.5752 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7544 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.8364 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.8179 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8394 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8179 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8086 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.4927 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7373 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.5537 | |
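
Aside: statistics of this shape can be captured by replacing the UNet's attention processors so the post-softmax probabilities are visible. The sketch below is illustrative only, assuming the diffusers StableDiffusionXLPipeline; StatsProcessor, records, and the timestep pre-hook are invented names, not the exact script that produced this log. The 140 activations per timestep match the UNet's 140 attn1/attn2 modules (8 in down_blocks.1, 40 in down_blocks.2, 20 in mid_block, 60 in up_blocks.0, 12 in up_blocks.1), and the constant means are what a softmax row averaging to 1/sequence_length predicts (0.0130 ~ 1/77 text tokens; 0.0010 ~ 1/1024 and 0.0002 ~ 1/4096 spatial tokens).

import torch
from diffusers import StableDiffusionXLPipeline
from diffusers.models.attention_processor import Attention

records = []           # (timestep, module_name, is_cross, mean, std, min, max)
current = {"t": None}  # updated once per UNet call by the pre-hook below

class StatsProcessor:
    """Re-implements basic attention so the softmaxed probs can be inspected.
    Group norms, residuals, and mask prep are omitted for brevity; SDXL's
    attn1/attn2 blocks do not use them for plain text-to-image."""
    def __init__(self, name):
        self.name = name

    def __call__(self, attn: Attention, hidden_states,
                 encoder_hidden_states=None, attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        if encoder_hidden_states is None:
            encoder_hidden_states = hidden_states
        q = attn.head_to_batch_dim(attn.to_q(hidden_states))
        k = attn.head_to_batch_dim(attn.to_k(encoder_hidden_states))
        v = attn.head_to_batch_dim(attn.to_v(encoder_hidden_states))
        # attention_mask is None for plain SDXL text2img, so no prep needed.
        probs = attn.get_attention_scores(q, k, attention_mask)  # post-softmax
        records.append((current["t"], self.name, is_cross,
                        probs.mean().item(), probs.std().item(),
                        probs.min().item(), probs.max().item()))
        out = attn.batch_to_head_dim(torch.bmm(probs, v))
        return attn.to_out[1](attn.to_out[0](out))  # linear proj + dropout

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
# The UNet receives the timestep as its second positional argument.
pipe.unet.register_forward_pre_hook(
    lambda module, args: current.update(t=float(args[1])))
pipe.unet.set_attn_processor(
    {name: StatsProcessor(name.removesuffix(".processor"))
     for name in pipe.unet.attn_processors})
pipe("an astronaut riding a horse").images[0].save("generated_image.png")
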
Timestep 201.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.8633
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.8511
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8926
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8887
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.9053
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.6719
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.9766
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7363
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0092, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.6582
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.7559
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.8408
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.9189
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0078, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.8301
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9219
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.9546
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8315
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.9819
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9902
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0484, min=0.0000, max=0.8252
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.8076
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0475, min=0.0000, max=0.8604
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6157
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=0.9536
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6143
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.9468
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8721
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.9473
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8730
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.9355
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9175
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.9893
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6030
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.9585
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5913
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.9854
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7939
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.9624
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7090
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.9932
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4990
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9917
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8159
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.9478
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8735
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0415, min=0.0000, max=0.9062
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9175
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.7188
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9507
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.5361
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9175
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.7910
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9810
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.6812
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9062
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.7979
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9731
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0366, min=0.0000, max=0.8657
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7246
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.7192
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9199
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.7432
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9653
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7119
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8594
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8628
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9106
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.4719
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8652
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.6440
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8589
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.7144
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.6631
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0187, min=0.0000, max=0.8228
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9297
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.8027
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6938
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.4783
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6904
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.5659
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8477
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.7671
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6196
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.8589
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8345
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0471, min=0.0000, max=0.9370
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8037
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.6387
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.5078
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.6221
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7344
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.8467
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5996
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.7744
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5591
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0178, min=0.0000, max=0.5981
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5107
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.8691
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6411
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.5869
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5391
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.9165
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8779
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.8892
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6865
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9609
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9561
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.5654
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9766
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.6084
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9893
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0172, min=0.0000, max=0.5210
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7568
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.6567
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7231
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.8999
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7393
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.8438
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6401
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9009
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7295
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8853
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6685
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.8813
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8955
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.8979
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0025, min=0.0000, max=0.9536
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7085
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.5488
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7656
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.8218
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.8218
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8433
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8081
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7861
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.4954
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7246
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.5483
Timestep 221.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.8330
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.8496
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8862
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8916
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.9004
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.6792
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.9751
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7305
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0092, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.6641
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.7241
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.8491
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9980
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.9272
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0077, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.8374
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9204
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.9536
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8188
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0401, min=0.0000, max=0.9834
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9902
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.8242
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.8633
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0477, min=0.0000, max=0.8594
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6201
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0524, min=0.0000, max=0.9541
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6128
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.9492
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8823
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.9497
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8550
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.9409
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9028
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.9897
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6040
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.9609
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6562
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9858
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7505
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.9629
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7397
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4912
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9927
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8276
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0402, min=0.0000, max=0.9482
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8682
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0418, min=0.0000, max=0.9067
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9116
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.7163
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9492
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.5713
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.9150
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.7959
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9766
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.6851
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9097
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.7974
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9653
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.8691
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.7036
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.7158
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8701
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.7446
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.9629
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7124
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8296
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.8633
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9028
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.4851
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8540
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.6440
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8550
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6738
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.6909
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.8267
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9199
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.8066
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7100
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0186, min=0.0000, max=0.4583
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6431
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.5635
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8364
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0234, min=0.0000, max=0.7656
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5972
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8540
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8110
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0474, min=0.0000, max=0.9360
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8057
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.6553
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.5127
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.6196
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7354
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.8574
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5415
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.7871
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5576
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.6113
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5044
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.8770
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6367
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.6094
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5361
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.9214
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8643
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0377, min=0.0000, max=0.9028
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6865
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.9644
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9473
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.5664
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9746
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0183, min=0.0000, max=0.6211
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9844
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0173, min=0.0000, max=0.5444
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7554
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.6909
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7539
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.9023
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7246
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.8506
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5986
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.9053
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7012
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8872
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6753
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0394, min=0.0000, max=0.8804
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8911
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.8999
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9541
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7056
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.5527
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7710
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8169
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.8228
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8408
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7930
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7803
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.4983
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7007
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.5488
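
Continuing the sketch above, under the same assumptions and invented names, each "Timestep ..." section of this log can then be emitted by grouping the records by timestep and formatting one line per module:

from collections import defaultdict

by_timestep = defaultdict(list)
for t, name, is_cross, mean, std, mn, mx in records:
    by_timestep[t].append((name, is_cross, mean, std, mn, mx))

print("Attention statistics per diffusion timestep:")
for t in sorted(by_timestep):  # ascending order, as in this log
    rows = by_timestep[t]
    print(f"Timestep {t}: {len(rows)} attention activations recorded.")
    for name, is_cross, mean, std, mn, mx in rows:
        kind = "cross attention" if is_cross else "self attention"
        print(f"Module {name} ({kind}): "
              f"mean={mean:.4f}, std={std:.4f}, min={mn:.4f}, max={mx:.4f}")
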
Timestep 241.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.8145
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8472
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8774
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8936
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.9038
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.6860
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.9727
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7217
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0091, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.6772
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.7046
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.8530
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9336
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0076, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.8423
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9165
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0364, min=0.0000, max=0.9521
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8008
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.9844
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9893
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0489, min=0.0000, max=0.8228
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.8940
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.8579
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6162
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0526, min=0.0000, max=0.9541
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6392
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.9517
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8916
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.9507
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8389
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.9453
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8877
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.9897
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6309
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.9629
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7100
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0353, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7046
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9609
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7534
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.9907
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4873
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9932
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8096
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9487
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8672
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0421, min=0.0000, max=0.9072
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9058
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.7124
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9443
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.5996
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9146
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.8003
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9688
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.6880
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9189
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.7964
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.9692
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.8750
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6938
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7104
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8447
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.7451
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.9590
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7095
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7734
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.8628
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8979
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.5664
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8491
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.6392
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8521
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.6470
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.6958
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.8315
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.9019
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0220, min=0.0000, max=0.8115
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7192
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.4583
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6084
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.5601
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8193
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.7720
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5947
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.8560
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7749
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0476, min=0.0000, max=0.9375
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8022
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.6680
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.5166
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.6147
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7744 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.8647 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5283 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.7925 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5889 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.6221 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5005 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0187, min=0.0000, max=0.8813 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6274 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.6289 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5303 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.9346 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8428 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0382, min=0.0000, max=0.9136 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6821 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.9663 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9326 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.5659 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9722 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.6294 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9751 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.5620 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7412 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.7168 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7734 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.9048 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7139 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.8540 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5396 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.9092 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6523 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.8887 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6885 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.8794 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8862 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.9004 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9536 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7012 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.5630 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7744 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8164 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.8232 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8345 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.7725 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.7759 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.5010 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.6753 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.5576 | |
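Note: the gist does not include the script that produced these numbers. As a rough reconstruction, stats like these (mean/std/min/max of the softmax attention probabilities, per attn1/attn2 module) can be recorded in diffusers by swapping in a custom attention processor. This is a minimal sketch under stated assumptions: `AttnStatsProcessor` and the `stats` dict are hypothetical names, the per-timestep grouping seen in this log is not shown, and `norm_cross` handling is omitted (SDXL does not use it).

```python
import torch
from diffusers import StableDiffusionXLPipeline
from diffusers.models.attention_processor import Attention, AttnProcessor

class AttnStatsProcessor(AttnProcessor):
    """Stock attention processor plus per-call stats over the softmax probabilities."""

    def __init__(self, name: str, store: dict):
        self.name = name    # e.g. "down_blocks.1.attentions.0.transformer_blocks.0.attn1"
        self.store = store  # shared dict: module name -> list of (mean, std, min, max)

    def __call__(self, attn: Attention, hidden_states,
                 encoder_hidden_states=None, attention_mask=None, **kwargs):
        # Recompute the probabilities the way the stock (non-fused) processor does,
        # purely for logging: attn1 attends over the latent itself (self attention),
        # attn2 over the text embedding (cross attention).
        context = hidden_states if encoder_hidden_states is None else encoder_hidden_states
        query = attn.head_to_batch_dim(attn.to_q(hidden_states))
        key = attn.head_to_batch_dim(attn.to_k(context))
        probs = attn.get_attention_scores(query, key, attention_mask)
        self.store.setdefault(self.name, []).append(
            (probs.mean().item(), probs.std().item(),
             probs.min().item(), probs.max().item()))
        # Hand the real computation off to the unmodified base processor.
        return super().__call__(attn, hidden_states, encoder_hidden_states,
                                attention_mask, **kwargs)

stats: dict[str, list] = {}
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16).to("cuda")
# attn_processors is keyed like "...transformer_blocks.0.attn1.processor".
pipe.unet.set_attn_processor({
    name: AttnStatsProcessor(name.removesuffix(".processor"), stats)
    for name in pipe.unet.attn_processors})
pipe("a photo of an astronaut").images[0].save("generated_image.png")
```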
Timestep 261.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.8438 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.8438 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8643 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8950 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.9136 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.6919 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.9688 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7070 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0091, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.6943 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6973 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.8535 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.9385 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.8438 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9087 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.9502 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7905 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0407, min=0.0000, max=0.9858 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9868 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.8208 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.9111 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0481, min=0.0000, max=0.8569 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5996 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.9536 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6685 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.9531 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8975 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.9526 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8232 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.9492 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8721 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.9897 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6831 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.9639 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7500 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.9868 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6387 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.9614 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7554 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4751 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9937 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7363 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9482 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8638 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0424, min=0.0000, max=0.9067 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8975 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.7070 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9268 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6235 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9116 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8110 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.9541 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.6909 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9185 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.7993 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9604 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.8809 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6865 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7051 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8408 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.7456 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9531 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7036 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6748 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.8604 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8901 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.6426 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8423 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.6475 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8706 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.6514 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6890 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.8345 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8765 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.8218 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7207 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.4609 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5742 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.5557 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7891 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.7759 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5508 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.8569 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7354 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0478, min=0.0000, max=0.9375 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.7939 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.6772 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5186 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.6294 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7974 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.8706 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5039 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.7964 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6162 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0187, min=0.0000, max=0.6362 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4980 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.8838 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6196 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.6426 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5229 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9429 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8101 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9233 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6719 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9683 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9336 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.5742 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9678 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.6357 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9561 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0175, min=0.0000, max=0.5737 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7168 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.7358 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7808 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.9141 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7031 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.8545 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5171 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9111 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6343 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8896 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.6978 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0396, min=0.0000, max=0.8794 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8809 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.8984 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9521 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.6953 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.5625 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.7734 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8135 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.8208 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8232 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7441 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.7754 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.5059 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.6558 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.5679 | |
Timestep 281.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.8652 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0241, min=0.0000, max=0.8398 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8472 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8960 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.9199 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.6978 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.9644 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6968 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0090, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.7090 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6895 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.8481 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.9414 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0074, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.8418 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8979 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.9473 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8018 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.9863 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9839 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.8188 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.9209 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0483, min=0.0000, max=0.8560 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5620 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0530, min=0.0000, max=0.9541 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6846 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9546 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9009 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.9541 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8062 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.9541 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8540 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.9897 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7349 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9648 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7729 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.9873 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5591 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.9634 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7412 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.9839 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4834 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.9941 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5796 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.9478 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8574 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0428, min=0.0000, max=0.9087 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8901 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.7065 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.8916 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6421 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9072 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.8179 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9453 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.6934 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9131 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.8022 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9224 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.8862 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6880 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.6997 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8330 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.7466 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9453 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7026 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6562 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.8613 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8813 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.7031 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8345 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.6577 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8818 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.6587 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6699 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.8354 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8442 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.8281 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7134 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.4644 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5615 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.5513 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7598 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.7783 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4919 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.8574 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6719 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.9395 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.7778 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.6860 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5034 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.6543 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8062 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.8735 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4895 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.7988 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6313 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.6553 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4871 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.8843 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6069 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.6538 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5098 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.9453 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7715 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9312 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6372 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0503, min=0.0000, max=0.9688 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9370 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.5879 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9624 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.6431 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8809 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0176, min=0.0000, max=0.5820 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7334 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.7495 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7681 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.9219 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6855 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.8535 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5376 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.9121 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6602 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.8896 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7085 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.8799 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8789 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.8945 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0024, min=0.0000, max=0.9541 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.6899 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.5493 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.7681 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.8086 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.8164 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.8066 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7471 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.7754 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.5137 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.6313 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.5796 | |
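A quick sanity check on the three mean values that repeat throughout this log: softmax weights over N keys always average exactly 1/N, regardless of the inputs, so each module's mean pins down its key-sequence length. Assuming the default 1024x1024 SDXL generation (128x128 latent), that gives 77 text tokens for cross attention and 32x32 = 1024 or 64x64 = 4096 latent positions for self attention at the two resolutions:

```python
# Mean of a softmax over N keys is exactly 1/N (each row sums to 1 over N entries),
# so the repeated means identify each module's attention span.
for n_keys, label in [(77,   "cross attention over 77 text tokens (attn2)"),
                      (1024, "self attention at 32x32 latent (down_blocks.2 / mid / up_blocks.0)"),
                      (4096, "self attention at 64x64 latent (down_blocks.1 / up_blocks.1)")]:
    print(f"1/{n_keys} = {1 / n_keys:.4f}  ->  {label}")
# Prints 0.0130, 0.0010, 0.0002 -- the three means seen on every line above.
```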
Timestep 301.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.8755 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0239, min=0.0000, max=0.8340 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.8242 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.8965 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.9238 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.7026 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.9580 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6865 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0090, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.7207 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6753 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.8511 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.9434 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9976 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.8354 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8823 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.9453 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7983 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.9873 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.9800 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8188 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.9263 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0484, min=0.0000, max=0.8574 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5166 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0531, min=0.0000, max=0.9546 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6909 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0501, min=0.0000, max=0.9551 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9014 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.9565 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7891 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9580 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8442 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9897 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7695 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9653 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7700 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9873 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4565 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.9644 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7207 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9844 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5200 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0375, min=0.0000, max=0.9941 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5493 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0418, min=0.0000, max=0.9473 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8477 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.9087 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8818 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.7090 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.8740 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6562 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9019 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.8223
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9419
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.6958
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.8979
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.8047
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.9302
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.8896
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7051
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.6943
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8159
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.7485
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.9355
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7021
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6343
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0433, min=0.0000, max=0.8638
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8726
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.7485
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8242
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.6636
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8701
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.6562
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6118
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.8364
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8071
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.8286
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6958
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.4780
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6221
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.5469
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7373
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7837
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4761
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.8584
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6001
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0481, min=0.0000, max=0.9404
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.7598
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0217, min=0.0000, max=0.6943
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.4907
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.6763
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8052
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.8740
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4941
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.8003
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6265
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.6787
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4829
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.8823
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6011
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.6655
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4978
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.9478
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7188
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.9375
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5625
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9692
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9448
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.5977
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9570
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0186, min=0.0000, max=0.6509
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8813
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0177, min=0.0000, max=0.5898
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7388
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.7622
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7324
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.9272
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6592
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.8599
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5503
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.9116
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6611
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.8896
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7104
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.8809
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8784
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.8892
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9561
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.6851
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.5273
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.7588
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8018
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.8130
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7808
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.7524
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.7749
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.5200
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.6309
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.5913
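
The collection code is not part of this log, but statistics of this shape can be gathered with a custom diffusers attention processor that materializes the softmax probabilities and records their moments before computing the usual output. The sketch below is a minimal illustration under that assumption; AttnStatsProcessor, stats_log, and the prompt string are hypothetical names, not taken from this gist.

# Minimal sketch, assuming the diffusers Attention API (torch + diffusers installed).
# AttnStatsProcessor and stats_log are hypothetical names, not from this log.
import torch
from diffusers import StableDiffusionXLPipeline

class AttnStatsProcessor:
    """Records mean/std/min/max of one module's attention probabilities."""
    def __init__(self, name, stats_log):
        self.name = name
        self.stats_log = stats_log

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        context = encoder_hidden_states if is_cross else hidden_states
        query = attn.head_to_batch_dim(attn.to_q(hidden_states))
        key = attn.head_to_batch_dim(attn.to_k(context))
        value = attn.head_to_batch_dim(attn.to_v(context))
        # softmax(Q K^T / sqrt(d)); each row sums to 1
        probs = attn.get_attention_scores(query, key, attention_mask)
        self.stats_log.append((self.name,
                               "cross attention" if is_cross else "self attention",
                               probs.mean().item(), probs.std().item(),
                               probs.min().item(), probs.max().item()))
        out = attn.batch_to_head_dim(torch.bmm(probs, value))
        return attn.to_out[1](attn.to_out[0](out))  # output projection, then dropout

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
stats_log = []
pipe.unet.set_attn_processor({
    name: AttnStatsProcessor(name.rsplit(".processor", 1)[0], stats_log)
    for name in pipe.unet.attn_processors
})
pipe("a photo of an astronaut").images[0].save("generated_image.png")

The constant per-module means in the log are consistent with this view: softmax rows sum to 1, so the mean probability is roughly 1/N for key length N, i.e. 1/77 ≈ 0.0130 for cross attention over the 77 CLIP text tokens, 1/1024 ≈ 0.0010 for self attention over 32×32 latent positions, and 1/4096 ≈ 0.0002 over 64×64 positions.
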
Timestep 321.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.8784
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.8345
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.7983
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8955
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.9229
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.7056
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.9507
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6743
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0089, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.7310
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6743
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.8535
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.9443
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9966
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.8237
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8701
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0374, min=0.0000, max=0.9458
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7852
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0416, min=0.0000, max=0.9873
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.9736
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.8208
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.9268
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.8608
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5308
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0533, min=0.0000, max=0.9551
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6821
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.9551
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9019
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.9590
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7739
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9595
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8481
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.9888
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7754
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.9658
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7393
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3564
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9648
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7002
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5029
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5181
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0422, min=0.0000, max=0.9497
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8369
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.9062
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8740
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.7188
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.8735
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6685
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8955
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.8252
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.9438
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.6987
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.8843
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.8071
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9229
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.8921
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7173
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.6885
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7959
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.7524
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.9229
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7007
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6382
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0433, min=0.0000, max=0.8647
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8633
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.7798
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8154
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.6675
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8481
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.6592
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6484
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.8350
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7534
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.8247
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6816
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.4910
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6987
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0207, min=0.0000, max=0.5420
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6582
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.7969
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4670
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8589
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5635
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0483, min=0.0000, max=0.9414
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.7383
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.7026
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.4980
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.6958
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7964
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.8721
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4961
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.7983
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6089
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.7012
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4888
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.8818
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5898
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.6807
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4790
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9502
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6504
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0401, min=0.0000, max=0.9414
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4390
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0510, min=0.0000, max=0.9688
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9561
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.6064
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9463
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0187, min=0.0000, max=0.6582
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8730
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0178, min=0.0000, max=0.5938
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7358
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0217, min=0.0000, max=0.7695
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6509
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.9297
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6167
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.8677
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5527
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.9102
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6411
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.8882
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7114
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0398, min=0.0000, max=0.8813
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8789
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.8872
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9526
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.6943
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.5166
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7466
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.7920
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.8037
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7471
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.7568
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.7651
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.5254
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.6235
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.6025
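
Because the line format is regular, blocks like the one above can be parsed back into structured records for per-timestep comparison or plotting. A small sketch under the assumption of this exact "Timestep ..." / "Module ..." format; parse_attn_log and attn_stats.log are hypothetical names.

# Small sketch: parse the log lines above into structured records.
import re

TIMESTEP_RE = re.compile(r"Timestep ([\d.]+): (\d+) attention activations recorded\.")
MODULE_RE = re.compile(r"Module (\S+) \((self|cross) attention\): "
                       r"mean=([\d.]+), std=([\d.]+), min=([\d.]+), max=([\d.]+)")

def parse_attn_log(lines):
    records, timestep = [], None
    for line in lines:
        if m := TIMESTEP_RE.search(line):
            timestep = float(m.group(1))
        elif m := MODULE_RE.search(line):
            records.append({"timestep": timestep, "module": m.group(1),
                            "kind": m.group(2) + " attention",
                            "mean": float(m.group(3)), "std": float(m.group(4)),
                            "min": float(m.group(5)), "max": float(m.group(6))})
    return records

# Example: the three highest-variance cross-attention modules at timestep 321.0
with open("attn_stats.log") as f:
    recs = parse_attn_log(f)
for r in sorted((r for r in recs if r["timestep"] == 321.0
                 and r["kind"] == "cross attention"),
                key=lambda r: r["std"], reverse=True)[:3]:
    print(r["module"], r["std"])
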
Timestep 341.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.8745
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8350
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.7700
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.8940
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.9165
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.7080
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.9424
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6753
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0089, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.7388
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6729
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8555
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.9434
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9956
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.8066
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8447
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.9463
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7544
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0418, min=0.0000, max=0.9873
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.9629
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.8242
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.9209
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.8638
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5352
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0534, min=0.0000, max=0.9551
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.6562
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0503, min=0.0000, max=0.9546
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9028
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.9609
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7563
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9614
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8608
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.9878
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7559
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.9658
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6841
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3276
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.9688
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6938
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.9897
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4404
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0384, min=0.0000, max=0.9951
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4692
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0426, min=0.0000, max=0.9590
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8208
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.9019
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8687
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7300
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8716
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6753
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8872
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.8281
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9453
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.7075
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.8721
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.8110
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8984
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.8936
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7183
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.6816
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7734
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.7573
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.9067
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.6987
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6318
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.8662
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8535
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.8052
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8057
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6689
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8213
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0207, min=0.0000, max=0.6611
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7031
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.8330
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6855
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.8247
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6606
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.5127
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6807
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.5376
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6143
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.8081
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4780
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.8608
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5474
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0485, min=0.0000, max=0.9453
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.7148
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0221, min=0.0000, max=0.7095
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5000
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.7129
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7783
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.8755
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4917
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.7959
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5732
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.7222
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4912
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.8799
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5752
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.6973
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4622
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.9531
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5669
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9448
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3789
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0512, min=0.0000, max=0.9678
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9595
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.6260
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9336
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.6636
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8638
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.5977
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7334
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.7749
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6299
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.9316
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5625
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.8765
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5459
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.9082
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6216
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8975
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7056
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0398, min=0.0000, max=0.8823
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8784
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.8877
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0023, min=0.0000, max=0.9434
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.7026
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.5386
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7329
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.7798
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.7891
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7026
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.7593
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.7529
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.5298
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.6133
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6133
Timestep 361.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.8638
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0236, min=0.0000, max=0.8354
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.7344
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.8921
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.9082
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.7061
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.9312
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6792
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0088, min=0.0000, max=0.9995
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.7451
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6626
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.8584
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.9409
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9937
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7842
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8018
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9458
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7646
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0421, min=0.0000, max=0.9878
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.9414
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.8276
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.9062
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0489, min=0.0000, max=0.8657
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5366
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0535, min=0.0000, max=0.9556
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.6230
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0504, min=0.0000, max=0.9536
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9028
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.9634
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7407
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9619
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8770
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7202
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0375, min=0.0000, max=0.9653
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6270
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0373, min=0.0000, max=0.9878
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3188
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.9712
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7046
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4475
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9956
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.3972
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.9668
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7959
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0440, min=0.0000, max=0.9038
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8633
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.7393
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8696
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6792
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8750
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.8296
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9468
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7178
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0056, min=0.0000, max=0.8721
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.8135
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8535
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.8940
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7236
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6812
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7632
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.7617
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.9014
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7002
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6011
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.8682
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8394
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.8237
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.7915
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6699
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7856
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.6621
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7212
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.8311
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6338
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.8325
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6299
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.5312
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6904
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.5356
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5747
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8164
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4973
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8647
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5317
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9468
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.6914
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.7168
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4873
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0234, min=0.0000, max=0.7285
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7554
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.8843
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4780
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.7905
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5146
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.7417
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4893
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.8921
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5518
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7139
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4375
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.9561
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6724
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9487
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3735
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0515, min=0.0000, max=0.9668
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9541
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.6450
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9141
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.6685
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8623
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0179, min=0.0000, max=0.5947
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7295
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.7754
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6479
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.9307
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5034
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.8848
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5493
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.9053
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7080
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.9019
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6968
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.8823
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8784 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.8882 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.9277 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7061 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.5537 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7178 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.7627 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7695 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.6787 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.7603 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.7432 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.5332 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.6064 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6216 | |
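Note on reading these summaries: softmax attention probabilities sum to 1 along the key axis, so each module's mean is exactly 1/(number of keys) and carries no information on its own; only std and max indicate how peaked the attention maps are. The constants above are consistent with the default 1024x1024 SDXL output: cross attention (attn2) attends over 77 text tokens (1/77 ≈ 0.0130); self attention (attn1) attends over 32x32 = 1024 latent positions in down_blocks.2, mid_block and up_blocks.0 (1/1024 ≈ 0.0010) and over 64x64 = 4096 positions in down_blocks.1 and up_blocks.1 (1/4096 ≈ 0.0002). The min=0.0000 entries are rounding, not exact zeros.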
Timestep 381.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.8481 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.8350 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.6987 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.8896 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.9209 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.7007 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.9185 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6802 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0087, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.7510 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6460 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.8662 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.9360 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9907 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.7676 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8096 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9453 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7803 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0423, min=0.0000, max=0.9893 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.8960 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0501, min=0.0000, max=0.8301 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.8779 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0490, min=0.0000, max=0.8672 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5259 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0537, min=0.0000, max=0.9561 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.5508 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.9531 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8970 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.9653 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7246 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9634 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8940 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6797 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9653 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5654 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3059 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.9736 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7236 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4446 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9961 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3735 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.9722 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7637 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.9058 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8569 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.7480 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8647 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.6826 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8574 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.8315 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9458 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.7271 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8730 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.8140 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8374 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0373, min=0.0000, max=0.8936 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7212 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6826 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7515 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.7651 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8965 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7046 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6143 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.8691 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8267 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.8374 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.7744 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.6763 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7593 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.6636 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7007 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.8286 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6206 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8398 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6128 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.5469 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6733 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.5498 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5220 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.8198 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5093 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8696 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5283 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0488, min=0.0000, max=0.9478 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.6694 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.7324 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4675 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.7456 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7256 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4624 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.7866 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4934 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0207, min=0.0000, max=0.7598 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4800 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.9038 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5195 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.7285 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4111 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0366, min=0.0000, max=0.9595 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8071 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0414, min=0.0000, max=0.9517 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3604 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0517, min=0.0000, max=0.9668 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9482 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.6592 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8745 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.6699 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8594 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.5889 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7100 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0220, min=0.0000, max=0.7744 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6641 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.9287 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.4949 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.8892 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5327 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.9014 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.8335 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.8955 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6797 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.8823 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8765 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.8882 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8979 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7080 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.5752 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7065 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.7397 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7598 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.6733 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.7598 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.7310 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.5371 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.5947 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6270 | |
Timestep 401.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.8276 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0234, min=0.0000, max=0.8335 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.6548 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8872 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.9292 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.6924 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.9014 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6787 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0086, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.7563 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6230 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8716 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.9292 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9863 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.7793 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8159 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9468 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7886 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0426, min=0.0000, max=0.9907 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.8032 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0503, min=0.0000, max=0.8315 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.8145 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.8672 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4922 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0538, min=0.0000, max=0.9561 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.4753 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.9521 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8853 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.9663 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7075 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9644 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.9062 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6338 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0382, min=0.0000, max=0.9663 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5024 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.2891 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9746 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7271 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0366, min=0.0000, max=0.9946 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4314 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0396, min=0.0000, max=0.9961 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3975 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0438, min=0.0000, max=0.9756 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7241 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0446, min=0.0000, max=0.9111 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8481 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.7554 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8550 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6816 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8359 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8330 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9429 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.7334 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.8809 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.8135 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8271 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0374, min=0.0000, max=0.8926 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7041 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6812 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7329 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.7661 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8677 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7080 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6309 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.8691 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8120 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.8442 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.7554 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.6865 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7285 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.6626 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6553 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.8252 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6162 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.8457 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5864 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.5605 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7090 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.5625 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.4670 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.8164 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5093 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8730 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5420 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0490, min=0.0000, max=0.9512 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.6606 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.7441 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.4402 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0239, min=0.0000, max=0.7588 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6870 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.8984 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4521 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.7783 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4937 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.7734 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4656 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.9102 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4844 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.7397 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3940 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.9614 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8823 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0418, min=0.0000, max=0.9536 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3882 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0519, min=0.0000, max=0.9663 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9409 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.6709 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8242 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.6680 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8574 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.5942 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6646 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0220, min=0.0000, max=0.7671 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6748 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.9297 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4778 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8896 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5474 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8970 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.9277 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8872 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6626 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.8818 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8750 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8877 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0022, min=0.0000, max=0.8433 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.7100 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.5894 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.7026 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.7134 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7617 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.6597 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.7568 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.7236 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.5400 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.5854 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6294 | |
Timestep 421.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.7983 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.8320 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.6143 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.8843 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.9326 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.6782 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.9033 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6763 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0086, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.7612 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6118 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.8779 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.9214 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9795 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.7896 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8237 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9517 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7920 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0428, min=0.0000, max=0.9917 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7236 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.8330 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.7769 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.8662 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4277 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0540, min=0.0000, max=0.9565 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.4116 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0506, min=0.0000, max=0.9507 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8701 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.9663 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6963 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9663 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.9160 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5913 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9673 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4426 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3066 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.9751 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7271 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.9956 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4487 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.9966 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4094 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.9780 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6792 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9150 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8320 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.7603 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8389 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.6816 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8174 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.8340 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9331 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.7402 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8906 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.8184 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8257 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0374, min=0.0000, max=0.8916 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6733 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6772 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7202 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.7705 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8169 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7095 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6353 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.8691 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8091 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.8467 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.7261 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0203, min=0.0000, max=0.6987 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6938 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.6646 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6426 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.8203 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6196 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.8521 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5522 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.5713 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7324 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.5732 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.4236 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.8271 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4983 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.8770 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5508 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.9551 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6299 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.7554 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.4250 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.7710 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6460 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.9028 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4312 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.7695 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4802 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.7856 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4453 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.9082 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4771 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.7466 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4041 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0373, min=0.0000, max=0.9634 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9038 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0422, min=0.0000, max=0.9551 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3948 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=0.9663 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9321 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.6802 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.7935 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0189, min=0.0000, max=0.6709 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8560 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.6094 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5952 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0221, min=0.0000, max=0.7773 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6797 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.9380 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4731 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.8882 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5522 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.8906 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9644 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8892 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6484 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.8823 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8716 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8877 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.7505 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.7070 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.5967 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.6953 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.6851 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7622 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.6377 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7524 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7124 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.5430 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5757 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6279 | |
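A note on the means above before the next timestep block: every softmax attention row sums to 1, so its mean over n keys is exactly 1/n regardless of prompt or image content. The logged constants therefore just encode sequence lengths: attn2 (cross attention) attends over the 77 CLIP text tokens (1/77 ≈ 0.0130), while attn1 (self attention) attends over the latent positions, 1/1024 ≈ 0.0010 for the 32x32-resolution blocks (down_blocks.2, mid_block, up_blocks.0) and 1/4096 ≈ 0.0002 for the 64x64 blocks (down_blocks.1, up_blocks.1). The resolutions are inferred from SDXL's standard 128x128 latent and are an assumption of this note, not something stated in the log. A minimal sketch of the arithmetic:

# Hypothetical sanity check (not part of the gist): the mean of a softmax row
# over n keys is exactly 1/n, so the logged means encode sequence length only.
for n, label in [
    (77, "attn2: cross attention over 77 text tokens"),
    (32 * 32, "attn1 at 32x32 (down_blocks.2 / mid_block / up_blocks.0)"),
    (64 * 64, "attn1 at 64x64 (down_blocks.1 / up_blocks.1)"),
]:
    print(f"{label}: 1/n = {1 / n:.4f}")
# -> 0.0130, 0.0010, 0.0002, matching the means recorded above.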
Timestep 441.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.7656 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.8306 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5669 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8804 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.9336 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.6606 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.9131 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6714 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0085, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.7646 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6021 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.8828 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.9131 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9756 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.7998 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8364 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0387, min=0.0000, max=0.9565 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7881 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.9927 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7129 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0506, min=0.0000, max=0.8330 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7344 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8643 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3364 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0541, min=0.0000, max=0.9570 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3789 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9497 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8633 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.9648 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6821 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9673 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.9214 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9883 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5522 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9683 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.3994 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0382, min=0.0000, max=0.9873 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.3579 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.9761 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7246 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0373, min=0.0000, max=0.9961 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4365 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.9971 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4229 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0447, min=0.0000, max=0.9790 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6289 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0454, min=0.0000, max=0.9170 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8076 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.7627 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8135 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.6802 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8262 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.8350 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9170 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7446 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8945 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.8271 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8257 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0375, min=0.0000, max=0.8911 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6475 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6719 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7422 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.7739 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7822 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7085 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6289 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.8701 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8223 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0207, min=0.0000, max=0.8477 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.6821 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.7090 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6553 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.6665 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5884 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0217, min=0.0000, max=0.8154 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6460 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.8613 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5054 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.5801 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7505 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.5781 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.3826 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.8506 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5176 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.8794 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5522 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.9585 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.5757 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.7656 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4133 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.7837 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6006 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.9058 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4058 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.7632 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4529 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.7954 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4199 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4692 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.7515 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4246 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0377, min=0.0000, max=0.9653 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8848 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0427, min=0.0000, max=0.9556 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3674 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0523, min=0.0000, max=0.9673 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9219 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.6885 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.7524 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.6802 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8530 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.6177 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5664 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.7920 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6812 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.9424 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5029 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.8862 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5474 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8838 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9565 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8916 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6279 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0400, min=0.0000, max=0.8838 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.8652 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8882 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.6797 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.6992 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.5967 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.6870 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.6934 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7617 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.6289 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7471 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.6992 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.5454 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5630 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.6221 | |
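Where the "140 attention activations" count comes from: each transformer block in the SDXL UNet logs one attn1 (self attention) and one attn2 (cross attention) entry per timestep, and the module names in this log enumerate 70 such blocks. Tallying the counts read off the names above:

# 70 transformer blocks x (attn1 + attn2) = 140 entries per timestep.
blocks = {
    "down_blocks.1": 2 * 2,   # 2 attentions x 2 transformer_blocks each
    "down_blocks.2": 2 * 10,  # 2 attentions x 10 transformer_blocks each
    "mid_block":     1 * 10,  # 1 attention  x 10 transformer_blocks
    "up_blocks.0":   3 * 10,  # 3 attentions x 10 transformer_blocks each
    "up_blocks.1":   3 * 2,   # 3 attentions x 2 transformer_blocks each
}
assert sum(blocks.values()) == 70
assert 2 * sum(blocks.values()) == 140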
Timestep 461.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.7681 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.8281 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5234 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8770 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.9292 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.6396 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.9194 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.6665 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0084, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.7690 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6001 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.8853 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.9028 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9692 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.8091 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8384 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9604 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7783 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7148 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.8325 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6587 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.8662 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2766 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0542, min=0.0000, max=0.9565 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3738 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9487 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8521 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.9634 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6621 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9683 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.9214 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.9883 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5454 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9692 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3882 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0384, min=0.0000, max=0.9868 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.3999 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.9805 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7217 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.9966 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4141 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9976 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4417 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0452, min=0.0000, max=0.9785 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5801 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0458, min=0.0000, max=0.9170 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.7710 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7627 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.7725 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0200, min=0.0000, max=0.6748 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8237 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.8364 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9028 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7476 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8926 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.8359 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8213 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.8940 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6396 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6650 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7510 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.7764 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7710 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7109 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6084 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.8696 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8301 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.8462 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.6362 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0207, min=0.0000, max=0.7153 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6196 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.6685 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5103 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0220, min=0.0000, max=0.8242 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6582 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.8711 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4597 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.5859 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7622 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.5767 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.3647 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.8682 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5181 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.8813 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5381 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.9614 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.5518 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.7739 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.3962 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.7964 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5557 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.9067 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4077 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.7759 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4180 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.8003 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3945 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.8618 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4626 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.7612 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4529 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.9668 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.8105 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.9541 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3247 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0524, min=0.0000, max=0.9678 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9077 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.6934 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7480 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.6958 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8535 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0180, min=0.0000, max=0.6143 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5923 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.7964 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6729 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.9404 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5718 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.8833 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5405 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.8760 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.9175 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.8931 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6011 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0401, min=0.0000, max=0.8867 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.8579 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0495, min=0.0000, max=0.8877 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0021, min=0.0000, max=0.7173 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.6870 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.5942 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.6777 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.6899 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7598 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.6294 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7402 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.7065 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5474 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5474 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.6133 | |
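The gist does not include the instrumentation that produced these lines, so the following is only a plausible reconstruction under stated assumptions, not the author's code: attention probabilities can be recomputed from each module's query/key tensors and reduced to the same four statistics. The function name and tensor shapes here are illustrative assumptions.

import torch

def attention_stats(query, key, label):
    # Assumed shapes: query/key are (batch * heads, seq_len, head_dim),
    # as in a standard scaled-dot-product attention implementation.
    scale = query.shape[-1] ** -0.5
    probs = torch.softmax(query @ key.transpose(-2, -1) * scale, dim=-1)
    print(f"Module {label}: mean={probs.mean().item():.4f}, "
          f"std={probs.std().item():.4f}, min={probs.min().item():.4f}, "
          f"max={probs.max().item():.4f}")

# Example with random tensors shaped like a cross-attention call (77 keys):
attention_stats(torch.randn(20, 1024, 64), torch.randn(20, 77, 64),
                "down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention)")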
Timestep 481.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7705 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.8257 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.4832 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.8711 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.9214 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.6123 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.9263 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6641 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0083, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.7734 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6138 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.8887 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8950 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9653 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8179 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8252 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9639 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7627 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.9941 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7173 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0509, min=0.0000, max=0.8306 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.6523 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.8682 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2769 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0543, min=0.0000, max=0.9565 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3840 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9473 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8384 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.9673 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6289 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9683 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.9199 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.9878 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5386 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.9702 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3752 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9858 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4121 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.9849 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6943 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.9971 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4141 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0415, min=0.0000, max=0.9976 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4580 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.9775 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5356 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0462, min=0.0000, max=0.9150 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7217 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7612 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.7451 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0201, min=0.0000, max=0.6650 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8047 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.8394 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.9067 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7490 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.8804 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.8428 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8145 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0377, min=0.0000, max=0.8999 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6313 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6709 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7451 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.7759 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7598 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7124 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5859 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.8672 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8306 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.8442 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.6099 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.7202 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5928 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.6729 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4783 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.8291 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6235 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.8862 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4187 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.5894 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7710 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.5684 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.3706 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8809 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4973 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.8828 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5059 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.9624 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.5303 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0236, min=0.0000, max=0.7827 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.3765 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.8091 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5103 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.9077 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4077 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0241, min=0.0000, max=0.7915 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3779 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.8022 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3757 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.8481 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4587 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.7656 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4812 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9692 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7100
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.9521
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3335
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0526, min=0.0000, max=0.9668
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.8911
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.6953
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7598
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0190, min=0.0000, max=0.7119
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8540
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.6025
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5698
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.7944
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6582
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.9341
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6421
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.8799
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5210
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8677
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8535
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.8940
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5898
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0402, min=0.0000, max=0.8901
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.8467
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.8877
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7524
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.6724
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.5859
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.6777
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.6758
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7568
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.6167
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7300
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.7246
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5483
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5161
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.6055
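The block above repeats for every logged timestep: one line per attention module in the SDXL UNet, and the UNet has exactly 140 such modules (70 transformer blocks, each with a self-attention attn1 and a cross-attention attn2), which matches the "140 attention activations recorded" count per timestep. The sketch below shows one way stats like these can be collected with diffusers by installing a custom attention processor; it is a minimal illustration, not the script that produced this log. AttnStatsProcessor and the `records` list are assumed names; the diffusers calls themselves (set_attn_processor, get_attention_scores, head_to_batch_dim) exist in the library.

```python
# Minimal sketch, assuming diffusers' StableDiffusionXLPipeline. Not the
# original logging script; class/variable names are illustrative.
import torch
from diffusers import StableDiffusionXLPipeline
from diffusers.models.attention_processor import Attention


class AttnStatsProcessor:
    """Re-materializes the softmax attention weights and logs mean/std/min/max.

    Simplified relative to the default processor: the group-norm/spatial-norm/
    residual handling is omitted, which is a no-op for the attn1/attn2 modules
    inside SDXL's transformer blocks anyway.
    """

    def __init__(self, name: str, records: list):
        self.name = name      # e.g. "down_blocks.1.attentions.0.transformer_blocks.0.attn1"
        self.records = records

    def __call__(self, attn: Attention, hidden_states,
                 encoder_hidden_states=None, attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        context = encoder_hidden_states if is_cross else hidden_states

        query = attn.head_to_batch_dim(attn.to_q(hidden_states))
        key = attn.head_to_batch_dim(attn.to_k(context))
        value = attn.head_to_batch_dim(attn.to_v(context))

        # Softmax attention weights, shape (batch*heads, q_tokens, kv_tokens).
        probs = attn.get_attention_scores(query, key, attention_mask)
        self.records.append((
            self.name, "cross" if is_cross else "self",
            probs.mean().item(), probs.std().item(),
            probs.min().item(), probs.max().item(),
        ))

        out = torch.bmm(probs, value)
        out = attn.batch_to_head_dim(out)
        out = attn.to_out[0](out)   # output projection
        out = attn.to_out[1](out)   # dropout
        return out


pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

records: list = []
# attn_processors keys end in ".processor"; strip that to get the module path
# format used in this log.
pipe.unet.set_attn_processor({
    name: AttnStatsProcessor(name.removesuffix(".processor"), records)
    for name in pipe.unet.attn_processors
})

image = pipe("a photo of an astronaut").images[0]
image.save("generated_image.png")
```

Materializing `probs` this way trades memory for inspectability: fused/flash attention paths never expose the weight matrix, so a stats-logging processor has to fall back to the explicit bmm formulation.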
Timestep 501.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7710
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.8208
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.4919
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.8647
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.9019
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.5938
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.9292
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6606
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0082, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.7769
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6270
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.8896
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.8877
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.9639
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.8252
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8018
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0394, min=0.0000, max=0.9668
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7393
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.9941
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7295
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0510, min=0.0000, max=0.8296
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.6431
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.8691
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2795
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0544, min=0.0000, max=0.9561
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3896
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9463
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.8232
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.9707
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6387
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9688
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.9146
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9878
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5161
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0398, min=0.0000, max=0.9712
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3921
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0387, min=0.0000, max=0.9849
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3713
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9878
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6562
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0384, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4558
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0421, min=0.0000, max=0.9980
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4785
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0462, min=0.0000, max=0.9766
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5073
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0466, min=0.0000, max=0.9111
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7290
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7573
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.7197
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0202, min=0.0000, max=0.6479
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7744
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.8423
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9141
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7476
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8823
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8457
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8032
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9048
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6309
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.6924
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7202
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.7710
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7046
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7129
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5767
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.8623
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8242
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.8428
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.5840
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.7275
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5786
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.6772
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.4819
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.8271
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5645
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.9014
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3933
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.5918
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7744
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0221, min=0.0000, max=0.5547
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.3848
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8882
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4607
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.8823
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4590
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9614
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.5371
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.7915
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.3623
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.8198
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4653
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.9102
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3826
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.7993
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3481
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.7998
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3704
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.8389
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4558
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.7578
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5088
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0387, min=0.0000, max=0.9717
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5845
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0440, min=0.0000, max=0.9497
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3696
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0527, min=0.0000, max=0.9678
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0059, min=0.0000, max=0.8755
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.6953
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7734
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0191, min=0.0000, max=0.7261
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8550
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.5864
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5186
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.7866
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6401
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.9268
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6743
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.8755
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5220
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.8550
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7603
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.8940
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5259
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.8896
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.8291
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.8877
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0020, min=0.0000, max=0.7764
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.6777
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.5713
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.6807
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.6577
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7520
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.6138
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7178
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.7451
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5483
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.4768
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.6104
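The log format is regular enough to parse mechanically. Below is a hypothetical parser for it; the regexes mirror the line layout above exactly, while the function name parse_attention_log and the file name "attention_stats.log" are illustrative assumptions.

```python
# Hypothetical parser for this log format: groups records by timestep and
# module so a statistic can be tracked across the sampling schedule.
import re
from collections import defaultdict

MODULE_LINE = re.compile(
    r"Module (?P<module>\S+) \((?P<kind>self|cross) attention\): "
    r"mean=(?P<mean>[\d.]+), std=(?P<std>[\d.]+), "
    r"min=(?P<min>[\d.]+), max=(?P<max>[\d.]+)"
)
TIMESTEP_LINE = re.compile(r"Timestep (?P<t>[\d.]+): \d+ attention activations")


def parse_attention_log(path: str) -> dict:
    stats = defaultdict(dict)  # timestep -> module name -> stats dict
    timestep = None
    with open(path) as f:
        for line in f:
            if (m := TIMESTEP_LINE.search(line)):
                timestep = float(m.group("t"))
            elif (m := MODULE_LINE.search(line)) and timestep is not None:
                stats[timestep][m.group("module")] = {
                    "kind": m.group("kind"),
                    **{k: float(m.group(k)) for k in ("mean", "std", "min", "max")},
                }
    return stats


# Example: cross-attention std of one mid-block module across timesteps.
log = parse_attention_log("attention_stats.log")
for t in sorted(log):
    print(t, log[t]["mid_block.attentions.0.transformer_blocks.0.attn2"]["std"])
```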
Timestep 521.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7715
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.8159
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5078
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.8564
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8701
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.5928
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.9282
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6577
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0081, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.7803
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6333
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.8892
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9990
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.8955
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.9575
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.8311
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7939
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.9688
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7080
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0438, min=0.0000, max=0.9946
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7285
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0511, min=0.0000, max=0.8301
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6304
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0501, min=0.0000, max=0.8730
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2766
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0545, min=0.0000, max=0.9556
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3850
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9453
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8125
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.9731
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6660
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9678
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.9058
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4622
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.9722
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4014
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9839
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.2981
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.9888
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6055
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0386, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4670
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0426, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4854
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0466, min=0.0000, max=0.9766
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5376
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0470, min=0.0000, max=0.9048
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7456
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7524
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.7090
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.6265
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7842
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8452
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9121
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.7437
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8848
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0350, min=0.0000, max=0.8423
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8057
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9077
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6289
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7095
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6782
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.7637
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6738
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7119
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5615
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8584
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8130
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.8398
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.5674
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.7466
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5767
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.6924
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.4780
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.8198
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4988
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.9121
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3857
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.5933
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.7754
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.5405
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4109
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.8882
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4106
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.8809
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4419
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.9575
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.5259
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.7983
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3416
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8130
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4177
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.9097
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3342
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.8018
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3398
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.8027
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3657
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0236, min=0.0000, max=0.8291
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4690
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.7275
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5381
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9731
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6040
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0444, min=0.0000, max=0.9468
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4031
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0529, min=0.0000, max=0.9678
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0058, min=0.0000, max=0.8618
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.6929
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7783
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0191, min=0.0000, max=0.7393
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8535
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.5698
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5601
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.7773
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6167
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.9229
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6660
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.8721
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5098
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.8403
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6606
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.8950
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5146
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.8809
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.8013
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.8867
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.7920
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.6807
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.5508
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.6836
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.6509
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7466
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.5967
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.6958
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.7568
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5488
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.4180
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.6187
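One regularity worth noting (an observation about the numbers, not part of the original script's output): the per-line means are constant across modules and timesteps because a softmax row over N keys always averages to exactly 1/N, so the mean pins down the key-sequence length rather than saying anything about the image. The sequence lengths below assume a 1024x1024 SDXL generation (128x128 latents); only std/min/max carry per-step information.

```python
# Sanity check of the constant means in this log:
#   cross attention, 77 text tokens:                        1/77   = 0.0130
#   self attention at 32x32 (down_blocks.2, mid, up_blocks.0): 1/1024 = 0.0010
#   self attention at 64x64 (down_blocks.1, up_blocks.1):      1/4096 = 0.0002
for n in (77, 1024, 4096):
    print(f"1/{n} = {1 / n:.4f}")
```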
Timestep 541.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.7837
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.8096
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5161
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.8452
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8174
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.5933
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.9214
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6533
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0080, min=0.0000, max=0.9961
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.7832
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6362
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.8862
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0054, min=0.0000, max=0.9985
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.9048
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.9434
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8369
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7798
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.9707
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6714
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0440, min=0.0000, max=0.9946
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7119
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0512, min=0.0000, max=0.8306
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6040
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.8760
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2979
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0546, min=0.0000, max=0.9546
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3696
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9458
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8032
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.9756
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6855
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9673
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8862
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4338
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0402, min=0.0000, max=0.9736
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4084
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9849
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.2952
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9893
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5605
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4609
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4785
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0469, min=0.0000, max=0.9790
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5215
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0473, min=0.0000, max=0.9023
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7646
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7466
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.6948
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.6138
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8086
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.8481
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.9072
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7354
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8857
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.8379
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7974
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.9097
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6294
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7168
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5874
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0364, min=0.0000, max=0.7656
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6753
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7104
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5454
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8569
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.7896
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.8394
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.5786
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.7744
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5908
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.7114
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5376
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8091
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4443
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.9229
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3853
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.6021
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7744
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.5645
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4441
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.8809
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3618
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.8770
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4536
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0504, min=0.0000, max=0.9590
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.5112
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.8052
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.3381
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.8105
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3748
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0239, min=0.0000, max=0.9092
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2876
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.8013
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2908
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8135
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3708
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.8179
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4839
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.7080
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5664
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9741
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6230
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0448, min=0.0000, max=0.9438
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4260
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0529, min=0.0000, max=0.9683
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0057, min=0.0000, max=0.8530
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.7012
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7759
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0191, min=0.0000, max=0.7500
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8496
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.5552
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5801
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.7720
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5957
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.9185
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6206
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.8677
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5137
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.8413
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5835
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.8950
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5127
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.8765
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7529
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.8867
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0019, min=0.0000, max=0.7979
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.6792
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.5146
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.6860
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.6230
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7407
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.5161
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.6592
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7622
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5493
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.3484
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0350, min=0.0000, max=0.6245
Timestep 561.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.7905
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.8003
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5098
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.8262
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7896
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.5928
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.9048
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6431
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0079, min=0.0000, max=0.9932
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.7876
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6299
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.8833
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.9980
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.9170
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.9199
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.8496
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7583
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0402, min=0.0000, max=0.9727
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6211
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.9941
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6763
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0514, min=0.0000, max=0.8306
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5693
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0504, min=0.0000, max=0.8770
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3201
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0547, min=0.0000, max=0.9546
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3445
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9463
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8066
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.9775
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6963
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9678
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8574
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4209
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.9741
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4189
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.2871
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0386, min=0.0000, max=0.9893
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5229
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4419
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5005
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0472, min=0.0000, max=0.9814
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5220
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0477, min=0.0000, max=0.9038
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7715
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.7393
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6641
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0206, min=0.0000, max=0.6323
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7939
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.8516
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8994
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7241
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8882
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0354, min=0.0000, max=0.8345
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.7837
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9097
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6284
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.7100
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5474
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.7671
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6665
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7095
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5308
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8545
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7886
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0217, min=0.0000, max=0.8413
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.6113
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0221, min=0.0000, max=0.8013
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5835
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0234, min=0.0000, max=0.7324
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5991
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.7974
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4673
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.9336
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3843
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.5996
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7671
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.6045
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4729
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.8784
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3049
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.8740
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4670
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.9609
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5029
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.8105
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3420
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.7939
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.3721
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.9111
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2551
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.7993
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2595
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.8213
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3809
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.8159
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4939
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.7383
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5884
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0394, min=0.0000, max=0.9751
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6387
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0452, min=0.0000, max=0.9424
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4446
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0530, min=0.0000, max=0.9688
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0055, min=0.0000, max=0.8628
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.7070
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7661
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.7603
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8354
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.5415
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5645
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.7729
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5718
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.9155
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5410
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.8657
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5122
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.8433
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5796
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.8950
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5127
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.8657
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6636
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.8872
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0018, min=0.0000, max=0.7881
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.6733
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.4724
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.6875
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.5884
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7358
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.5063
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.6416
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7432
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5488
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.2769
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0350, min=0.0000, max=0.6289
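
Each timestep block has the same shape (one header plus 140 module lines), so the dump is easy to post-process. A short parsing sketch follows, assuming a saved copy of this log at the placeholder path attention_stats.log; the example aggregate at the end is illustrative, not part of the original run.

import re
from collections import defaultdict

module_re = re.compile(
    r"Module (?P<name>\S+) \((?P<kind>self|cross) attention\): "
    r"mean=(?P<mean>[\d.]+), std=(?P<std>[\d.]+), "
    r"min=(?P<min>[\d.]+), max=(?P<max>[\d.]+)"
)

rows, timestep = [], None
with open("attention_stats.log") as f:
    for line in f:
        line = line.strip()
        if line.startswith("Timestep "):
            timestep = float(line.split()[1].rstrip(":"))
        elif (m := module_re.match(line)):
            rows.append((timestep, m["name"], m["kind"],
                         float(m["mean"]), float(m["std"]),
                         float(m["min"]), float(m["max"])))

# Example aggregate: how peaked the cross attention is at each timestep,
# averaged over the 70 cross-attention modules (half of the 140 per block).
peak = defaultdict(list)
for t, _, kind, _, _, _, mx in rows:
    if kind == "cross":
        peak[t].append(mx)
for t in sorted(peak):
    print(f"t={t:6.1f}  avg cross-attention max: {sum(peak[t]) / len(peak[t]):.4f}")
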
Timestep 581.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8018
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.7876
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.4773
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.8027
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.7544
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.5923
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.8740
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.6260
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0078, min=0.0000, max=0.9922
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.7900
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6226
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.8936
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.9302
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.9033
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.8594
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7637
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9736
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5708
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0445, min=0.0000, max=0.9941
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6333
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0515, min=0.0000, max=0.8306
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5244
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.8770
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3367
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0548, min=0.0000, max=0.9556
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3103
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9468
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8306
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.9790
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7026
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9692
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8071
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.9854
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4028
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.9746
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4304
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9883
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.2695
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9907
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5132
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4226
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0442, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4846
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0475, min=0.0000, max=0.9829
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5303
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.9043
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7598
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.7393
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6367
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0208, min=0.0000, max=0.6494
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7104
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.8530
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8901
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.7104
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.8872
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.8335
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7847
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9077
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6255
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.6855
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5254
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0366, min=0.0000, max=0.7656
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6445
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.7080
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5596
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8530
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8076
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0219, min=0.0000, max=0.8457
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.6367
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.8198
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6152
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.7563
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6094
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.7886
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4866
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.9478
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3962
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0221, min=0.0000, max=0.6187
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7515
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0230, min=0.0000, max=0.6318
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4529
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.8687
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.2937
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.8760
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4685
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9624
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5278
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0248, min=0.0000, max=0.8184
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3450
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.7837
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3782
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.9141
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2588
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.8066
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2573
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8271
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3838
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.8281
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4998
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.7651
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6060
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.9761
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6523
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.9468
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4475
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0531, min=0.0000, max=0.9688
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0053, min=0.0000, max=0.8384
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.7085
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.7368
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.7681
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8135
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.5347
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4939
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0232, min=0.0000, max=0.7744
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5439
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.9131
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5376
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.8652
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4944
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.8447
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7544
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.8950
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5073
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0406, min=0.0000, max=0.8604
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6875
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0499, min=0.0000, max=0.8877
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.7495
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6606
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.4290
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.6909
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.5996
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.7471
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.5571
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.6196
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.7544
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5479
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.2571
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.6313
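
A useful sanity check on the columns above: every softmax row sums to 1, so each module's mean is pinned at 1/key_length regardless of timestep, and only std and max carry per-step information. That matches the log exactly, assuming the default 1024x1024 output (128x128 latents): cross attention always reads 0.0130 (77 text tokens), self attention reads 0.0010 at the 32x32 levels (down_blocks.2, mid_block, up_blocks.0) and 0.0002 at the 64x64 levels (down_blocks.1, up_blocks.1). The arithmetic:

print(round(1 / 77, 4))    # 0.013  -> cross-attention mean (77 CLIP tokens)
print(round(1 / 1024, 4))  # 0.001  -> self-attention mean at 32x32 (1024 keys)
print(round(1 / 4096, 4))  # 0.0002 -> self-attention mean at 64x64 (4096 keys)
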
Timestep 601.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8110
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.7793
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.4329
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.7715
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.7051
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.5894
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.8213
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.6001
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0077, min=0.0000, max=0.9907
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.7925
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6045
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.9062
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.9966
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.9414
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8843
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.8730
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7646
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.9751
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5054
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0446, min=0.0000, max=0.9946
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5962
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8315
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4705
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0506, min=0.0000, max=0.8755
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3513
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0548, min=0.0000, max=0.9561
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.2673
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9458
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8506
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.9800
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7012
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9702
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7129
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.9849
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3555
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.9746
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4470
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9893
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.2412
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0398, min=0.0000, max=0.9917
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5239
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0390, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3970
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0446, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4209
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0477, min=0.0000, max=0.9849
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5444
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0482, min=0.0000, max=0.9077
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7368
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.7451
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.5952
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0210, min=0.0000, max=0.6685
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6982
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.8530
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8770
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.6934
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8882
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.8330
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8223
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9048
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6167
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6621
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5347
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.7578
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6196
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7065
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5762
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8506
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.7871
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.8535
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.6362
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.8486
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6138
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.7837
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6431
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.7866
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4658
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.9624
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3577
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.6426
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.7314
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.6104
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4229
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8647
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2949
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.8809
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4644
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0511, min=0.0000, max=0.9624
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5063
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.8296
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3423
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.7944
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3726
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.9209
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2612
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8159
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2495
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.8315
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3831
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8442
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4998
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.7881
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6172
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0401, min=0.0000, max=0.9775
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6592
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0463, min=0.0000, max=0.9521
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4397
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0532, min=0.0000, max=0.9707
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0051, min=0.0000, max=0.8071
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.7085
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7036
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0193, min=0.0000, max=0.7749
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7837
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0183, min=0.0000, max=0.5293
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4998
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.7798
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5220
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.9121
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5239
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.8672
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4312
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.8462
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6597
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.8960
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5342
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.8818
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6943
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.8882
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0017, min=0.0000, max=0.6602
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6416
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.3787
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.6987
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.6113
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7583
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.5840
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.5991
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.7417
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5469
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.2250
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.6313
Timestep 621.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8174
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.7788
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.4363
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.7329
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.6440
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.5874
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.7856
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.5718
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0076, min=0.0000, max=0.9888
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.7944
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5889
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.9180
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.9951
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.9487
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.8726
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.8867
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7485
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.9756
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4380
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0449, min=0.0000, max=0.9951
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5518
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0517, min=0.0000, max=0.8320
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4551
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.8745
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3604
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0549, min=0.0000, max=0.9565
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.2434
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9443
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8647
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.9810
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6987
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9707
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6040
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.9844
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3301
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0406, min=0.0000, max=0.9746
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4712
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0390, min=0.0000, max=0.9902
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2355
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4648
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0390, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3870
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0449, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4006
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5498
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0484, min=0.0000, max=0.9087
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7026
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7476
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.5591
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0211, min=0.0000, max=0.6846
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6973
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.8525
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8813
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.6733
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8867
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.8306
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7808
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0386, min=0.0000, max=0.9009
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6060
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6577
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6567
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.7500
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6289
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.7051
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5840
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0428, min=0.0000, max=0.8516
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7788
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.8682
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.6333
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.8828
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6021
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.8008
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6606
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8130
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4370
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9692
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3352
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.6660
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.7026
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.6011
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3987
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8667
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2832
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.8906
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4595
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0515, min=0.0000, max=0.9609
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.4380
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8389
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3462
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8057
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3555
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.9258
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2566
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.8257
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2393
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.8481
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3789
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.8608
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4958
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.8003
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6245
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9780
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6689
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0469, min=0.0000, max=0.9551
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4392
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0534, min=0.0000, max=0.9731
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.7559
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.7021
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7036
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.7783
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7896
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0183, min=0.0000, max=0.5239
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5020
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.7881
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5034
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9111
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5010
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.8701
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4238
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.8467
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6987
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.9014
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5713
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.8813
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.7036
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0501, min=0.0000, max=0.8882
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0016, min=0.0000, max=0.5190
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6157
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.3289
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7002
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.6230
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7686
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.5864
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.5947
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.7095
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5449
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.1902
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.6304
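Note: the gist does not include the script that produced this log. As a point of reference, statistics in this shape can be collected by swapping each Attention module's processor for one that materializes the softmax probabilities (the default SDPA path never does). The sketch below assumes the diffusers StableDiffusionXLPipeline; StatsProcessor is an illustrative name, not a library class, and the module paths it records match the keys seen in this log.

    import torch
    from diffusers import StableDiffusionXLPipeline
    from diffusers.models.attention_processor import AttnProcessor

    class StatsProcessor(AttnProcessor):
        # Recomputes the attention probabilities explicitly so they can be
        # inspected, logs summary statistics, then defers to the stock
        # processor for the actual output (so attention runs twice; this is
        # a diagnostic sketch, not a performance-minded implementation).
        def __init__(self, name, store):
            super().__init__()
            self.name = name    # e.g. "mid_block.attentions.0.transformer_blocks.0.attn1"
            self.store = store  # shared list of (name, is_cross, mean, std, min, max)

        def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                     attention_mask=None, **kwargs):
            is_cross = encoder_hidden_states is not None
            context = encoder_hidden_states if is_cross else hidden_states
            query = attn.head_to_batch_dim(attn.to_q(hidden_states))
            key = attn.head_to_batch_dim(attn.to_k(context))
            # softmax(QK^T / sqrt(d)); ignores attn.norm_cross for brevity
            # (SDXL does not use it).
            probs = attn.get_attention_scores(query, key, attention_mask)
            self.store.append((self.name, is_cross, probs.mean().item(),
                               probs.std().item(), probs.min().item(),
                               probs.max().item()))
            return super().__call__(attn, hidden_states,
                                    encoder_hidden_states=encoder_hidden_states,
                                    attention_mask=attention_mask, **kwargs)

    stats = []
    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
    ).to("cuda")
    # unet.attn_processors is keyed like "<module path>.processor";
    # reuse those keys so every attn1/attn2 module gets a recorder.
    pipe.unet.set_attn_processor({
        key: StatsProcessor(key.rsplit(".processor", 1)[0], stats)
        for key in pipe.unet.attn_processors
    })
    image = pipe("a photo of an astronaut").images[0]
    image.save("generated_image.png")

With classifier-free guidance the conditional and unconditional branches run in one batched call, so per-step counts like the "140 attention activations" above plausibly reflect multiple recordings per module per step; the sketch makes no attempt to reproduce the exact counting.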
Timestep 641.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.8218
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0223, min=0.0000, max=0.7754
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.4280
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.6880
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.5693
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.5845
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.7627
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.5791
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0075, min=0.0000, max=0.9863
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.7954
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5708
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.9326
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.9922
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9556
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.8604
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.8999
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7261
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0414, min=0.0000, max=0.9771
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4302
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9951
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5112
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0518, min=0.0000, max=0.8315
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4626
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.8765
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3635
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0550, min=0.0000, max=0.9565
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2283
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9419
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8750
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.9819
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7041
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9717
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6294
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.9844
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2771
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.9741
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4817
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9912
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2520
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4307
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3862
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0452, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4065
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0480, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5498
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9092
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6489
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.7485
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.5273
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0213, min=0.0000, max=0.6973
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6914
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8501
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8848
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.6704
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8853
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.8271
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7334
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9004
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.5972
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6528
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6572
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.7441
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6187
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.7026
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5752
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8511
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.7969
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.8818
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6123
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0234, min=0.0000, max=0.9048
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6343
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.8066
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6543
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.8267
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4382
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.9727
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3381
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.6797
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6699
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.6138
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4104
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.8843
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3318
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.8950
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4512
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0518, min=0.0000, max=0.9604
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.4453
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.8472
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3486
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.8179
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3447
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.9375
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2422
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8359
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2369
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.8633
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3711
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.8765
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4844
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.8086
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6255
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.9795
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6719
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0474, min=0.0000, max=0.9575
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4377
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0535, min=0.0000, max=0.9756
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0048, min=0.0000, max=0.7627
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.6929
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7114
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.7856
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.7905
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.5161
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5044
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.7935
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4895
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.9097
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5049
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.8770
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4185
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.8501
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7261
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.9048
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6074
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.8892
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7080
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.8887
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.4016
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.6016
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.2812
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6973
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.6377
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7749
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.5713
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.5938
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.6655
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5410
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.1522
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.6294
Timestep 661.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8267
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.7671
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.4070
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.6455
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.5815
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.5791
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.7305
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.5918
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0074, min=0.0000, max=0.9849
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.7954
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5566
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.9409
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.9868
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.9600
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.8413
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.9111
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7476
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.9790
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4702
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0452, min=0.0000, max=0.9956
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4644
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0520, min=0.0000, max=0.8315
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4580
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0509, min=0.0000, max=0.8784
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3630
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0551, min=0.0000, max=0.9570
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2281
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9395
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8823
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.9819
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6943
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9717
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6714
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9834
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2881
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.9731
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4849
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2588
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0414, min=0.0000, max=0.9932
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4514
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0387, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3882
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0453, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3972
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0480, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5483
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9136
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5942
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.7505
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5127
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.7119
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6841
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.8516
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8745
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.6694
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8828
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0364, min=0.0000, max=0.8218
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7148
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9028
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5874
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6470
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6416
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.7383
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5688
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.7007
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5522
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8501
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7612
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.8843
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4895
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.9131
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7300
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.8271
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6270
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.8315
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3938
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.9756
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3979
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.6904
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6431
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.6138
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4714
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.8984
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4600
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.8960
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4626
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0520, min=0.0000, max=0.9604
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4424
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.8521
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3418
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.8291
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3364
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9453
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2247
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.8496
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2322
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.8745
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3645
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.8901
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4746
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.8193
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6270
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.9810
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6772
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.9585
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4368
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0536, min=0.0000, max=0.9775
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.7729
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.6787
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7134
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.7915
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7817
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.5107
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5039
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.7993
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4792
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.9082
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5146
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.8809
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4104
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.8550
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7500
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.9053
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6338
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.8906
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7080
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0503, min=0.0000, max=0.8901
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.3708
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.5967
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.2461
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6938
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.6362
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7788
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.5093
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.5908
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.6270
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5366
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1238
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.6294
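Because every timestep block follows the same two line formats ("Timestep T: N attention activations recorded." followed by one "Module ..." line per statistic), the whole log can be loaded back into a table for analysis. A minimal parsing sketch; the file name attention_stats.log is a placeholder for wherever this gist is saved:

    import re
    from collections import defaultdict

    TIMESTEP = re.compile(r"Timestep ([\d.]+): (\d+) attention activations")
    MODULE = re.compile(
        r"Module (\S+) \((self|cross) attention\): "
        r"mean=([\d.]+), std=([\d.]+), min=([\d.]+), max=([\d.]+)"
    )

    def parse_log(path):
        """Return {timestep: [(module, kind, mean, std, min, max), ...]}."""
        records, current = defaultdict(list), None
        with open(path) as fh:
            for line in fh:
                if m := TIMESTEP.search(line):
                    current = float(m.group(1))
                elif (m := MODULE.search(line)) and current is not None:
                    name, kind, *vals = m.groups()
                    records[current].append((name, kind, *map(float, vals)))
        return records

    # Example: list cross-attention stds in the mid block at timestep 621.0
    # to see how concentrated the text conditioning is at that depth.
    recs = parse_log("attention_stats.log")
    mid_cross = [r for r in recs[621.0]
                 if r[0].startswith("mid_block") and r[1] == "cross"]
    for name, _, mean, std, lo, hi in mid_cross:
        print(f"{name}: std={std:.4f} max={hi:.4f}")

One pattern such a query makes easy to check against the numbers above: per timestep, the self-attention means sit at fixed values (0.0002 at the 1/64-resolution blocks, 0.0010 in the deeper blocks) because the mean of a softmax row is just one over the key-sequence length, so the informative columns are std and max.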
Timestep 681.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8306
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.7607
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.3774
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.6587
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.5913
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.5723
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.6880
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.6045
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9839
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.7954
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5435
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.9482
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.9785
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.9634
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8198
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.9204
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7666
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0421, min=0.0000, max=0.9810
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4980
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0454, min=0.0000, max=0.9956
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4360
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=0.8306
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4412
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0510, min=0.0000, max=0.8804
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3577
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0552, min=0.0000, max=0.9565
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2294
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9380
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8857
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9829
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6675
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9712
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6929
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.9824
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3074
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9727
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4719
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0397, min=0.0000, max=0.9932
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2756
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.9937
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4211
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0386, min=0.0000, max=0.9966
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3845
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3152
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0482, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5259
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.9150
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5962
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.7544
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.5278
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0215, min=0.0000, max=0.7246
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6772
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.8579
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8652
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6724
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8794
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0367, min=0.0000, max=0.8164
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.6895
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.9048
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5723
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.6479
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4338
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.7402
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5161
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.6997
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5210
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8506
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.5698
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.8765
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4233
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.9038 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6826 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.8530 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7300 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8423 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3486 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.9810 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5508 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.7061 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6123 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0253, min=0.0000, max=0.6118 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5742 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.9097 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6167 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4653 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0522, min=0.0000, max=0.9604 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4292 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8574 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3337 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.8413 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3120 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9502 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2101 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.8613 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2252 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.8828 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3545 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.9028 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4592 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.8315 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6211 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0414, min=0.0000, max=0.9819 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6763 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0483, min=0.0000, max=0.9585 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4370 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0538, min=0.0000, max=0.9785 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.7803 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.6572 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.7139 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.7944 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7598 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.5054 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5024 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8081 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4700 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.9067 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5190 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8931 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4004 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.8535 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7568 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.9058 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6426 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0414, min=0.0000, max=0.8877 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.6987 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0503, min=0.0000, max=0.8911 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0015, min=0.0000, max=0.3623 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.5986 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.2169 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6953 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.6138 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7754 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.3997 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.5840 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5767 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.5298 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1077 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0353, min=0.0000, max=0.6289 | |
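
For reference, here is a minimal sketch of how per-module statistics like the ones above can be collected. The script that produced this log is not included in the gist, so everything below is an assumption rather than the original implementation: it hooks the attn1/attn2 modules of diffusers' SDXL UNet with PyTorch forward hooks and recomputes the softmaxed attention probabilities via Attention.get_attention_scores, since the fused scaled-dot-product path never materializes them. The model ID, prompt, and hook details are illustrative.

from collections import defaultdict

import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
unet = pipe.unet

stats = defaultdict(list)  # (timestep, module name) -> [(mean, std, min, max)]
current = {"t": None}      # timestep of the most recent UNet call

def unet_pre_hook(module, args, kwargs):
    # The pipeline calls unet(latent_model_input, t, ...), so the timestep
    # is the second positional argument (an assumption about the call site).
    current["t"] = float(args[1])

unet.register_forward_pre_hook(unet_pre_hook, with_kwargs=True)

def make_hook(name):
    def hook(module, args, kwargs, output):
        hidden_states = args[0] if args else kwargs["hidden_states"]
        encoder_hidden_states = kwargs.get("encoder_hidden_states")
        if encoder_hidden_states is None:  # attn1: self attention
            encoder_hidden_states = hidden_states
        q = module.head_to_batch_dim(module.to_q(hidden_states))
        k = module.head_to_batch_dim(module.to_k(encoder_hidden_states))
        # Recompute the attention probabilities only to take statistics;
        # this is memory-hungry for the largest self-attention maps.
        probs = module.get_attention_scores(q, k).float()
        stats[(current["t"], name)].append(
            (probs.mean().item(), probs.std().item(),
             probs.min().item(), probs.max().item()))
    return hook

for name, module in unet.named_modules():
    if name.endswith(".attn1") or name.endswith(".attn2"):
        module.register_forward_hook(make_hook(name), with_kwargs=True)

image = pipe("an astronaut riding a horse").images[0]
image.save("generated_image.png")

Printing one line per (timestep, module) entry in the format used throughout this log reproduces its layout; the "140 attention activations recorded" per timestep matches SDXL's 140 attn1/attn2 modules (8 in down_blocks.1, 40 in down_blocks.2, 20 in mid_block, 60 in up_blocks.0, 12 in up_blocks.1).
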
Timestep 701.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8335
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0224, min=0.0000, max=0.7485
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.3562
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6792
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.5913
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.5620
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.6353
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.6152
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0073, min=0.0000, max=0.9834
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.7949
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5317
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9521
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.9712
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.9692
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7925
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.9268
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.7842
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0424, min=0.0000, max=0.9824
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5171
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.9951
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4287
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0522, min=0.0000, max=0.8306
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4036
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0511, min=0.0000, max=0.8818
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3518
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0554, min=0.0000, max=0.9565
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.2330
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9360
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8853
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9829
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6509
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9697
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6982
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9819
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2791
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.9785
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4546
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.9941
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2908
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0424, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3569
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0384, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3831
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.2418
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0482, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5161
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.9146
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6016
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.7588
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.5400
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0217, min=0.0000, max=0.7393
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6694
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.8662
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8359
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0314, min=0.0000, max=0.6758
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8750
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.8115
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6631
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.9062
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5552
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6494
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4165
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.7427
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4841
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0407, min=0.0000, max=0.6987
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4783
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8501
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.6997
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.8804
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.3965
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.8853
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6206
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.8740
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6533
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.8477
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3267
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9844
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5024
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.7158
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5825
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0255, min=0.0000, max=0.6279
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4956
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.9219
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6152
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.8896
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4382
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0524, min=0.0000, max=0.9614
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4019
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.8594
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3789
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.8530
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2720
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.9526
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2130
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.8687
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2186
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.8853
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3477
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.9131
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4456
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.8418
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.6172
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0416, min=0.0000, max=0.9824
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6768
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.9570
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4370
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0538, min=0.0000, max=0.9790
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7837
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.6328
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7168
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0197, min=0.0000, max=0.7920
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.7261
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0185, min=0.0000, max=0.4951
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5034
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.8159
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4561
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.9038
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5254
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8931
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3943
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.8545
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7407
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.9053
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6484
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0415, min=0.0000, max=0.8813
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6821
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0504, min=0.0000, max=0.8916
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.3503
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.5923
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.2161
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6924
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.5728
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7642
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.3357
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.5713
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5117
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5215
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0919
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0354, min=0.0000, max=0.6260
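
One regularity across every block above is that the mean column never moves. Each row of a softmaxed attention map sums to 1, so its mean over N keys is exactly 1/N regardless of timestep or content: 0.0130 ≈ 1/77 for cross attention over the 77 CLIP text tokens, and, assuming the default 1024x1024 generation, 0.0010 ≈ 1/1024 and 0.0002 ≈ 1/4096 for self attention at the 32x32 and 64x64 latent resolutions. Only the std and max columns carry signal about how peaked the attention is. A quick illustrative check:

import torch

# A softmax row sums to 1, so its mean over n keys is exactly 1/n,
# matching the constant means in this log (1/77, 1/1024, 1/4096).
for n in (77, 1024, 4096):
    probs = torch.softmax(torch.randn(8, n, n), dim=-1)
    print(f"n={n}: mean={probs.mean().item():.4f}  1/n={1.0 / n:.4f}")
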
Timestep 721.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8345
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0226, min=0.0000, max=0.7339
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.3232
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6938
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5933
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.5557
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.6011
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.6265
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0072, min=0.0000, max=0.9829
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.7920
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5249
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.9561
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.9712
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9736
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.7549
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9297
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7930
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0427, min=0.0000, max=0.9834
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5181
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0456, min=0.0000, max=0.9951
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4160
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0523, min=0.0000, max=0.8301
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3491
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0511, min=0.0000, max=0.8823
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3428
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0555, min=0.0000, max=0.9575
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.2465
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9341
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8813
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.9829
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6353
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9702
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6719
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.9819
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2147
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9819
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4326
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0402, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3008
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0427, min=0.0000, max=0.9951
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3474
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3752
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0454, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2629
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0482, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4946
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9204
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6050
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.7617
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.5562
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0218, min=0.0000, max=0.7656
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6602
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.8745
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.8184
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.6797
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8682
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.8062
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6538
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.9072
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5352
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6484
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4055
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.7393
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4736
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0407, min=0.0000, max=0.6958
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4529
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8501
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.7612
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.8818
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.3755
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8555
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5176
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.8823
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4282
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.8438
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2715
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9873
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2964
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.7202
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5635
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.6519
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3425
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.9253
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4016
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.8999
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4331
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0526, min=0.0000, max=0.9634
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.3889
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.8564
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3013
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8618
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2086
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9526
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2122
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.8716
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2130
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.8823
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3389
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.9204
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4297
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.8491
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.6079
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.9829
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6694
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0490, min=0.0000, max=0.9585
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4382
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0539, min=0.0000, max=0.9795
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0044, min=0.0000, max=0.7832
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.6006
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.7119
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.7847
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6704
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.4861
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5093
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8218
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4458
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.9019
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5303
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.8882
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3845
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8550
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.7251
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.9048
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6406
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0416, min=0.0000, max=0.8809
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6714
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0504, min=0.0000, max=0.8911
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0014, min=0.0000, max=0.3572
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.5820
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.2240
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6826
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.5234
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7412
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.3167
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.5557
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.4253
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.5122
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0772
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.6201
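
Because the dump is long, it is easier to compare timesteps after parsing the lines back into records. A sketch, assuming the log text is saved as attention_stats.log (the filename and regex are illustrative, matched to the line format used throughout):

import re
from collections import defaultdict

TIMESTEP = re.compile(r"Timestep ([0-9.]+):")
LINE = re.compile(
    r"Module (?P<name>\S+) \((?P<kind>self|cross) attention\): "
    r"mean=(?P<mean>[0-9.]+), std=(?P<std>[0-9.]+), "
    r"min=(?P<min>[0-9.]+), max=(?P<max>[0-9.]+)")

by_timestep = defaultdict(lambda: defaultdict(list))
t = None
with open("attention_stats.log") as f:
    for line in f:
        if (m := TIMESTEP.search(line)):
            t = float(m.group(1))
        elif (m := LINE.search(line)) and t is not None:
            # Track the max per attention kind; std could be handled the same way.
            by_timestep[t][m["kind"]].append(float(m["max"]))

for t, kinds in sorted(by_timestep.items()):
    parts = ", ".join(
        f"{kind}: peak max={max(v):.4f} over {len(v)} maps"
        for kind, v in sorted(kinds.items()))
    print(f"t={t:6.1f} -> {parts}")
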
Timestep 741.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.8384
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0227, min=0.0000, max=0.7139
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.2849
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.7041
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5825
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.5435
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.6382
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.6348
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0071, min=0.0000, max=0.9824
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.7905
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5186
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.9561
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.9717
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0364, min=0.0000, max=0.9775
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.7075
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.9312
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7959
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.9844
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5024
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.9951
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4011
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0525, min=0.0000, max=0.8296
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2896
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0513, min=0.0000, max=0.8833
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3318
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0556, min=0.0000, max=0.9585
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.2610
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9307
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8726
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9824
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6167
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9727
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6270
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9829
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2115
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.9863
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4082
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9951
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3042
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9961
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3398
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3650
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2898
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0483, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4805
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.9233
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6099
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.7612
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.5625
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0220, min=0.0000, max=0.7930
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6504
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.8828
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8169
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.6851
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8604
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.8027
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6631
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0398, min=0.0000, max=0.9116
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5151
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6494
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4111
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.7412
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4568
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0406, min=0.0000, max=0.6982
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4597
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.8496
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.7339
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0235, min=0.0000, max=0.8809
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.4106
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.8350
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3435
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0261, min=0.0000, max=0.8862
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2720
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.8398
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2844
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9893
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2888
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.7207
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5342
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.6729
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3367
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0353, min=0.0000, max=0.9258
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3608
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9077
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4219
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0527, min=0.0000, max=0.9658
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3696 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.8525 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2798 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.8696 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.1873 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.9492 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2078 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8809 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2056 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8857 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3289 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.9297 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4128 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.8564 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.5996 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0418, min=0.0000, max=0.9829 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6616 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.9595 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4529 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0539, min=0.0000, max=0.9795 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.7925 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.5654 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7095 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0196, min=0.0000, max=0.7754 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5918 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0184, min=0.0000, max=0.4836 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5264 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8247 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4380 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.8994 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5361 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.8882 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3770 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8560 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6953 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.9043 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6426 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.8848 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6553 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.8916 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.3604 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.5806 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.2211 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6694 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.4761 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7212 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.2903 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.5581 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.3474 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.5034 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0674 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.6089 | |
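
For reference, the sketch below shows how per-module statistics in this format can be collected with diffusers. It is a minimal illustration, not the exact script that produced this log: it assumes StableDiffusionXLPipeline, replaces every Attention module's processor with one that re-implements the basic attention path so the post-softmax probabilities are visible before being discarded, and uses a forward pre-hook on the UNet to track the current timestep. StatsAttnProcessor, stats, current_timestep, grab_timestep and the prompt are illustrative names and assumptions.

import torch
from collections import defaultdict
from diffusers import StableDiffusionXLPipeline

stats = defaultdict(list)  # timestep -> [(module name, is_cross, mean, std, min, max)]
current_timestep = None

class StatsAttnProcessor:
    # Minimal re-implementation of the default attention path; computing the
    # probabilities explicitly (instead of using fused SDPA) makes them observable.
    def __init__(self, name):
        self.name = name

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        query = attn.to_q(hidden_states)
        if encoder_hidden_states is None:
            encoder_hidden_states = hidden_states
        key = attn.to_k(encoder_hidden_states)
        value = attn.to_v(encoder_hidden_states)
        query, key, value = (attn.head_to_batch_dim(x) for x in (query, key, value))
        probs = attn.get_attention_scores(query, key, attention_mask)  # post-softmax
        stats[current_timestep].append((self.name, is_cross,
                                        probs.mean().item(), probs.std().item(),
                                        probs.min().item(), probs.max().item()))
        out = attn.batch_to_head_dim(torch.bmm(probs, value))
        return attn.to_out[1](attn.to_out[0](out))  # output projection, then dropout

def grab_timestep(module, args, kwargs):
    # UNet2DConditionModel.forward(sample, timestep, ...): timestep is args[1].
    global current_timestep
    current_timestep = float(args[1])

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16).to("cuda")
pipe.unet.register_forward_pre_hook(grab_timestep, with_kwargs=True)
pipe.unet.set_attn_processor({name: StatsAttnProcessor(name.removesuffix(".processor"))
                              for name in pipe.unet.attn_processors})
pipe("an example prompt").images[0].save("generated_image.png")
print("Image saved as 'generated_image.png'.")

The 140 activations per timestep match SDXL's 70 transformer blocks (4 in down_blocks.1, 20 in down_blocks.2, 10 in the mid block, 30 in up_blocks.0, 6 in up_blocks.1), each contributing one attn1 (self attention) and one attn2 (cross attention) call per UNet forward. The means are also a useful sanity check: each softmax row sums to 1, so the mean is exactly 1/(number of key positions), giving 1/77 ≈ 0.0130 for cross attention over the 77 text tokens at every block and timestep, and 1/1024 ≈ 0.0010 or 1/4096 ≈ 0.0002 for self attention at the 32x32 and 64x64 feature resolutions of a 1024x1024 generation.
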
Timestep 761.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8398
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.6909
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.2363
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.7134
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5532
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.5308
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.6177
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.6411
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9810
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.7881
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5117
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.9565
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.9707
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.9810
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.6797
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9292
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7876
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.9858
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4800
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0459, min=0.0000, max=0.9946
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3806
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0526, min=0.0000, max=0.8315
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2219
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0514, min=0.0000, max=0.8843
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3193
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0558, min=0.0000, max=0.9595
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0008, min=0.0000, max=0.2532
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9302
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8633
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9819
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5957
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9746
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5728
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.9849
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2539
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9897
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3877
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9956
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3054
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.9966
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3298
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3557
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0455, min=0.0000, max=0.9980
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2717
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0485, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4727
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.9219
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6104
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.7603
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5410
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0222, min=0.0000, max=0.8188
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6426
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.8931
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.8071
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6899
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.8516
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.7983
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6714
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0400, min=0.0000, max=0.9175
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5093
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.6499
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4194
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.7417
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4692
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0406, min=0.0000, max=0.7007
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4609
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.8486
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.6836
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0236, min=0.0000, max=0.8765
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.4807
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.8511
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3257
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.8916
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.2483
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.8389
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2810
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.9907
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2747
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7212
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5176
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.6938
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3567
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9331
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3677
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0390, min=0.0000, max=0.9136
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4075
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0529, min=0.0000, max=0.9692
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3418
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.8442
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2559
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.8750
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.1775
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.9448
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2059
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8901
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2021
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.8950
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3210
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9370
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4067
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.8652
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.5913
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.9839
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6514
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0494, min=0.0000, max=0.9600
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4578
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0540, min=0.0000, max=0.9790
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.8037
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.5547
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.7021
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.7632
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.5117
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0183, min=0.0000, max=0.4895
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5420
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.8193
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4219
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.8970
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5425
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.8911
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3682
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8525
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6631
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9033
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6592
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.8872
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6289
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0505, min=0.0000, max=0.8931
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.3489
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.5806
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.2051
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6729
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.4229
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.7114
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.2522
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.5605
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.2798
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.4951
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0594
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.5938
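
A matching print helper (again an illustrative sketch, assuming the stats dict from the sketch above) reproduces this log's layout:

def print_stats():
    print("Attention statistics per diffusion timestep:")
    for t in sorted(stats):  # the log lists timesteps in ascending order
        print(f"Timestep {t}: {len(stats[t])} attention activations recorded.")
        for name, is_cross, mean, std, lo, hi in stats[t]:
            kind = "cross attention" if is_cross else "self attention"
            print(f"Module {name} ({kind}): mean={mean:.4f}, std={std:.4f}, "
                  f"min={lo:.4f}, max={hi:.4f}")
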
Timestep 781.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8374
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.6748
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.2180
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.7153
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5557
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.5166
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.6152
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.6479
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0070, min=0.0000, max=0.9790
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.7847
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5083
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.9595
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9702
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.9839
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.6699
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.9360
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.7729
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.9883
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4385
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0460, min=0.0000, max=0.9946
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3616
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0527, min=0.0000, max=0.8354
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2008
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0515, min=0.0000, max=0.8848
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3042
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0558, min=0.0000, max=0.9600
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0008, min=0.0000, max=0.2426
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.9292
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8687
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9805
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5747
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9751
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5220
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9873
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3079
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.9927
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3740
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0414, min=0.0000, max=0.9961
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3003
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.9971
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3811
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9980
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3418
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0456, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2109
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9868
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4673
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0488, min=0.0000, max=0.9282
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6548
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.7617
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.5117
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0225, min=0.0000, max=0.8403
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6309
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.9019
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7690
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.6929
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.8398
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.7925
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6484
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.9219
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4971
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.6562
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4456
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.7373
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4417
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0406, min=0.0000, max=0.7021
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4580
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.8486
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.6729
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8726
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5200
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.8701
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3340
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.9126
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2472
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.8306
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2529
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.9917
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2603
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.7109
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4907
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.7119
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3586
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.9380
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3635
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.9180
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3879
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0529, min=0.0000, max=0.9702
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.3364
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.8506
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2350
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.8789
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.1633
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.9463
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2012
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.8975
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2007
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.9023
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3159
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.9438
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4204
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.8711
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.5811
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.9849
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6416
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.9609
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4580
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0540, min=0.0000, max=0.9780
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.8120
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.5449
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6914
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.7480
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5269
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0182, min=0.0000, max=0.4902
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5557
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.8096
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4185
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.8950
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5420
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.8940
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3494
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8462
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6318
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9023
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6641
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.8896
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6045
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0506, min=0.0000, max=0.8926
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0013, min=0.0000, max=0.3186
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.5845
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.1748
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6758
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.3706
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.7012
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.2255
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.5630
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.2542
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.4897
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0488
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.5923
Timestep 801.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8389
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0231, min=0.0000, max=0.6685
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1904
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.7158
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5635
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.4990
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.6772
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.6519
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0069, min=0.0000, max=0.9814
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.7969
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4949
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9619
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.9673
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0375, min=0.0000, max=0.9858
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6592
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.9468
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7485
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0439, min=0.0000, max=0.9902
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3857
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0460, min=0.0000, max=0.9946
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3425
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.8379
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2085
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8853
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.2915
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0560, min=0.0000, max=0.9604
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0008, min=0.0000, max=0.2356
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9287
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0039, min=0.0000, max=0.8750
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.9790
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5513
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.9746
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4739
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.9888
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3252
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9946
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3613
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.9961
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2893
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3857
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9980
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3315
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2421
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0487, min=0.0000, max=0.9878
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4675
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0489, min=0.0000, max=0.9312
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6982
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.7637
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.5078
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0229, min=0.0000, max=0.8584
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6152
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.9092
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.7222
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.6914
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.8267
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0386, min=0.0000, max=0.8018
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6206
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.9248
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4780
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.6733
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4761
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.7368
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4119
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0406, min=0.0000, max=0.7041
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4551
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.8564
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0041, min=0.0000, max=0.6831
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0239, min=0.0000, max=0.8711
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5156
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.8887
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3474
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.9326
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2435
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.8428
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2444
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.9927
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2485
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0250, min=0.0000, max=0.7588
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4668
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.7241
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3486
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9429
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3611
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.9150
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3691
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0530, min=0.0000, max=0.9702
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.3367
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.8584
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2316
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.8799
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.1611
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.9492
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2019
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.9043
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2130
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9053
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3167
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.9492
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4341
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.8774
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.5688
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0420, min=0.0000, max=0.9849
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6460
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.9609 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4521 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0540, min=0.0000, max=0.9775 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0040, min=0.0000, max=0.8027 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.5400 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6733 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.7300 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4626 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.5112 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5615 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.7861 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4163 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.8931 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5312 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.8994 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3206 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8438 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5918 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.8999 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.6514 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0420, min=0.0000, max=0.8931 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.5703 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0508, min=0.0000, max=0.8901 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.2778 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.5869 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.1864 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6772 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.3130 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.6890 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.2003 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.5664 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.2402 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.4846 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0417 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.5918 | |
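
The listing above closes one timestep's dump; the same 140 modules repeat for every timestep below. For reference, here is a minimal sketch of how per-module statistics like these can be captured from a diffusers SDXL pipeline. It assumes the classic batched-matmul attention path (the SDPA processor never materializes the probabilities); StatsProcessor, STATS, CURRENT_TIMESTEP, and install are illustrative names, not code from this gist, and the group-norm/4-D input handling of diffusers' full AttnProcessor is skipped for brevity.

import torch
from diffusers.models.attention_processor import Attention

STATS = {}               # timestep -> list of per-module stat tuples
CURRENT_TIMESTEP = None  # set from the denoising loop before each UNet call

class StatsProcessor:
    # Mirrors diffusers' default AttnProcessor, recording attention-prob stats.
    def __init__(self, name):
        self.name = name  # e.g. "down_blocks.1.attentions.0.transformer_blocks.0.attn1"

    def __call__(self, attn: Attention, hidden_states,
                 encoder_hidden_states=None, attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        context = encoder_hidden_states if is_cross else hidden_states
        query = attn.head_to_batch_dim(attn.to_q(hidden_states))
        key = attn.head_to_batch_dim(attn.to_k(context))
        value = attn.head_to_batch_dim(attn.to_v(context))
        # Softmax-normalized attention probabilities, shape (batch*heads, q, k).
        probs = attn.get_attention_scores(query, key, attention_mask)
        STATS.setdefault(CURRENT_TIMESTEP, []).append(
            (self.name, "cross attention" if is_cross else "self attention",
             probs.mean().item(), probs.std().item(),
             probs.min().item(), probs.max().item()))
        out = attn.batch_to_head_dim(torch.bmm(probs, value))
        return attn.to_out[1](attn.to_out[0](out))  # output projection + dropout

def install(unet):
    # Keys of unet.attn_processors end in ".processor"; stripping that suffix
    # yields exactly the module names used in the lines of this log.
    unet.set_attn_processor({
        key: StatsProcessor(key.removesuffix(".processor"))
        for key in unet.attn_processors
    })

Printing each recorded tuple then reproduces one "Module ... (self attention): mean=..., std=..., min=..., max=..." line, 140 of them per timestep for SDXL.
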
Timestep 821.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.8359 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.6562 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1663 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.7104 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.5713 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.4844 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.7305 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.6562 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0068, min=0.0000, max=0.9888 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0313, min=0.0000, max=0.8130 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5068 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.9639 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.9624 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9868 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6743 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9541 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.7217 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.9917 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3110 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0461, min=0.0000, max=0.9941 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3196 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.8413 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2223 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0517, min=0.0000, max=0.8853 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.2734 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0561, min=0.0000, max=0.9614 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0007, min=0.0000, max=0.2314 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0507, min=0.0000, max=0.9277 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8804 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9766 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5317 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0279, min=0.0000, max=0.9746 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4272 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9902 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3022 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.9966 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3464 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0428, min=0.0000, max=0.9966 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2683 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0460, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3235 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9976 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3191 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0461, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2971 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0488, min=0.0000, max=0.9888 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4685 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0491, min=0.0000, max=0.9268 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7266 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.7637 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.5083 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0233, min=0.0000, max=0.8730 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5972 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.9160 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6719 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.6953 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.8096 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.8198 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5698 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9272 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4604 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.6938 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4890 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.7407 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3850 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0407, min=0.0000, max=0.7065 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4441 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0433, min=0.0000, max=0.8638 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6836 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0241, min=0.0000, max=0.8774 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.4719 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.9121 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2822 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.9575 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2399 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.8525 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2673 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.9893 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2363 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.8188 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4375 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.7256 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3733 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0366, min=0.0000, max=0.9482 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3489 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0400, min=0.0000, max=0.8877 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3535 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0532, min=0.0000, max=0.9678 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.3066 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.8691 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2302 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.8770 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.1549 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.9521 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.1942 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.9146 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.1985 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.9087 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3154 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.9556 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4446 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.8848 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.5483 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0422, min=0.0000, max=0.9849 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6479 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9614 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4512 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0541, min=0.0000, max=0.9766 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.7397 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.5435 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6646 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0194, min=0.0000, max=0.7129 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4517 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.5039 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5542 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.7661 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4021 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0287, min=0.0000, max=0.8989 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5225 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0295, min=0.0000, max=0.9048 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3186 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0351, min=0.0000, max=0.8428 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5557 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.8994 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6196 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0422, min=0.0000, max=0.8970 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.5479 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0509, min=0.0000, max=0.8896 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0012, min=0.0000, max=0.2756 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.5928 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.2443 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6836 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.2585 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.6738 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.1613 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.5688 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.2175 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.4827 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.0327 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.5884 | |
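
Every record in this dump follows the same one-line format, so the whole file can be loaded back into structured form for comparison across timesteps. A parser sketch under that assumption (parse_log and the dict field names are illustrative; re.search also tolerates any trailing residue on a line):

import re
from collections import defaultdict

MODULE_LINE = re.compile(
    r"Module (?P<module>\S+) \((?P<kind>self|cross) attention\): "
    r"mean=(?P<mean>[\d.]+), std=(?P<std>[\d.]+), "
    r"min=(?P<min>[\d.]+), max=(?P<max>[\d.]+)")
TIMESTEP_LINE = re.compile(r"Timestep (?P<t>[\d.]+): \d+ attention activations")

def parse_log(path):
    # Returns {timestep: [{"module", "kind", "mean", "std", "min", "max"}, ...]}.
    records, t = defaultdict(list), None
    for line in open(path, encoding="utf-8"):
        if m := TIMESTEP_LINE.search(line):
            t = float(m.group("t"))
        elif (m := MODULE_LINE.search(line)) and t is not None:
            records[t].append(
                {"module": m.group("module"), "kind": m.group("kind"),
                 **{k: float(m.group(k)) for k in ("mean", "std", "min", "max")}})
    return records

With a saved copy of this file, records = parse_log("attn_stats.txt") (hypothetical filename) gives one list of 140 dicts per timestep, ready for e.g. plotting a module's std across the schedule.
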
Timestep 841.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.8325 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0234, min=0.0000, max=0.6577 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1594 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.7007 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5664 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.4836 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.7710 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.6577 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0067, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.8257 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5342 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.9678 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.9546 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9883 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.6748 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.9600 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.6821 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0446, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2808 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0462, min=0.0000, max=0.9946 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2942 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.8423 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2336 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0518, min=0.0000, max=0.8848 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.2583 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0562, min=0.0000, max=0.9619 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0007, min=0.0000, max=0.2234 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0506, min=0.0000, max=0.9258 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8818 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9722 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5059 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9727 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3948 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.9912 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2539 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0415, min=0.0000, max=0.9976 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3438 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.9976 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2417 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0471, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3677 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0396, min=0.0000, max=0.9971 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3062 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0464, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3376 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0490, min=0.0000, max=0.9893 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4802 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.9214 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7393 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.7637 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4980 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.8853 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5801 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.9238 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.6504 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.6958 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.7847 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0396, min=0.0000, max=0.8257 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4797 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.9326 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4675 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.7100 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4199 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0374, min=0.0000, max=0.7466 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3831 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.7056 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4331 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0437, min=0.0000, max=0.8706 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0042, min=0.0000, max=0.6772 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.8896 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.4001 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.9243 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3711 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.9683 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2612 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.8486 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2462 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.9878 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2264 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8467 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3962 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.7104 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3406 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0373, min=0.0000, max=0.9536 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3293 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.8926 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3550 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0534, min=0.0000, max=0.9644 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.3567 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.8784 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2257 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.8687 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.1436 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.9580 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.1881 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.9175 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.1788 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.9150 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3091 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.9629 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4519 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.5249 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0425, min=0.0000, max=0.9849 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6455 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.9614 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.4365 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0541, min=0.0000, max=0.9761 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.6997 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.5454 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6436 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.7075 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.4690 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0181, min=0.0000, max=0.4768 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5303 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0246, min=0.0000, max=0.7158 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3735 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.9009 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5020 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.9072 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3225 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.8438 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5283 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9014 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5771 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0425, min=0.0000, max=0.8970 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.5068 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0511, min=0.0000, max=0.8911 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.2382 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.6055 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.1688 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.6851 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.2484 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.6479 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.1224 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.5703 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1882 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0305, min=0.0000, max=0.4807 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.0252 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.5835 | |
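
One pattern worth noting before the next block: each row of a softmax sums to 1, so its mean is exactly 1 over the number of keys, and the reported means carry no information; only std, min, and max vary. That is consistent with every cross-attention line reading mean=0.0130 (about 1/77, the CLIP context length) and the self-attention means splitting into 0.0010 (about 1/1024, the 32x32 levels: down_blocks.2, mid_block, up_blocks.0) and 0.0002 (about 1/4096, the 64x64 levels: down_blocks.1, up_blocks.1), assuming a 1024x1024 generation. A quick check of that reading, including the "140 attention activations" count:

# Assumes SDXL at 1024x1024 (128x128 latents) with a 77-token text context.
for n_keys, logged_mean in [(77, 0.0130), (32 * 32, 0.0010), (64 * 64, 0.0002)]:
    assert abs(1 / n_keys - logged_mean) < 5e-4, (n_keys, logged_mean)

# (attentions x transformer_blocks) per UNet level, x2 for attn1 + attn2:
n_modules = (2 * 2 + 2 * 10 + 1 * 10 + 3 * 10 + 3 * 2) * 2
assert n_modules == 140  # matches "140 attention activations recorded"
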
Timestep 861.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0011, min=0.0000, max=0.8237 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0236, min=0.0000, max=0.6631 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1520 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6880 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5732 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.4822 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.7842 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.6592 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0066, min=0.0000, max=0.9956 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.8413 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.5591 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0327, min=0.0000, max=0.9707 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.9395 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.9888 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6558 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.9629 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6318 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0448, min=0.0000, max=0.9937 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2744 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0463, min=0.0000, max=0.9941 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2739 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.8428 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2485 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0520, min=0.0000, max=0.8848 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2448 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0562, min=0.0000, max=0.9619 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0007, min=0.0000, max=0.2029 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0504, min=0.0000, max=0.9253 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.8882 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9668 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4792 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9692 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3574 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0373, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2054 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3330 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2417 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0485, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3340 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.9971 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3083 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0469, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3789 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.9907 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4915 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.9204 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7388 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.7622 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.4761 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0243, min=0.0000, max=0.9009 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5571 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.9292 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.6406 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.6758 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.7534 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0401, min=0.0000, max=0.8159 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.4465 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.9312 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4827 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0332, min=0.0000, max=0.7207 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3481 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0377, min=0.0000, max=0.7422 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4285 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.7075 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4153
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0443, min=0.0000, max=0.8735
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0043, min=0.0000, max=0.6724
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0247, min=0.0000, max=0.8882
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.3489
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.9321
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3340
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.9771
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2190
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.8579
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2355
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0318, min=0.0000, max=0.9854
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2372
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.8613
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3682
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.7178
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2898
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9565
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3257
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.8857
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3455
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0536, min=0.0000, max=0.9629
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.3054
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.8931
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3232
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.8638
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.1387
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.9648
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.1672
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.9180
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.1688
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0306, min=0.0000, max=0.9189
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.2964
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0354, min=0.0000, max=0.9673
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4456
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.8999
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.4934
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.9839
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6377
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.9590
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.4275
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0540, min=0.0000, max=0.9771
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.6421
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.5469
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6206
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0198, min=0.0000, max=0.7285
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4404
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0183, min=0.0000, max=0.4556
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.5234
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0251, min=0.0000, max=0.7061
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3455
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.9062
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.4868
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.9082
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3169
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.8467
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5015
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0354, min=0.0000, max=0.9014
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5679
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0428, min=0.0000, max=0.8979
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.4612
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0513, min=0.0000, max=0.8979
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.2683
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.5811
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.1027
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.6714
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.1884
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.6143
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.0995
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.5693
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1494
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.4734
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.0199
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.5796
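Note on reading these numbers: a softmax row sums to 1, so the mean of an attention map is pinned at 1/(number of keys) and carries no signal. That matches the log: every cross-attention mean is 0.0130 ≈ 1/77 (the CLIP text token count), and the self-attention means 0.0010 and 0.0002 are consistent with 1/1024 and 1/4096 (32x32 and 64x64 latent positions). Only std and max (and occasionally min) distinguish modules and timesteps.

For reference, a minimal sketch of how per-module stats like these can be captured. This is an illustrative reconstruction under assumptions, not the exact script behind this log: it relies on the diffusers attention-processor API, and names such as stats_log, current_timestep, and the prompt are invented.

import torch
from collections import defaultdict
from diffusers import StableDiffusionXLPipeline

stats_log = defaultdict(list)   # timestep -> [(module, kind, mean, std, min, max)]
current_timestep = None         # set by a pre-hook on unet.forward, see below

class StatsAttnProcessor:
    """Recomputes attention explicitly so the softmaxed weights are observable."""
    def __init__(self, name):
        self.name = name        # e.g. "mid_block.attentions.0.transformer_blocks.9.attn1"

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        is_cross = encoder_hidden_states is not None
        context = encoder_hidden_states if is_cross else hidden_states
        q = attn.head_to_batch_dim(attn.to_q(hidden_states))
        k = attn.head_to_batch_dim(attn.to_k(context))
        v = attn.head_to_batch_dim(attn.to_v(context))
        probs = attn.get_attention_scores(q, k, attention_mask)  # softmaxed weights
        stats_log[current_timestep].append((
            self.name, "cross attention" if is_cross else "self attention",
            probs.mean().item(), probs.std().item(),
            probs.min().item(), probs.max().item()))
        out = attn.batch_to_head_dim(torch.bmm(probs, v))
        return attn.to_out[1](attn.to_out[0](out))  # output projection + dropout

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16).to("cuda")

def remember_timestep(module, args, kwargs):
    # unet.forward(sample, timestep, ...): stash the timestep for the processors
    global current_timestep
    t = args[1] if len(args) > 1 else kwargs["timestep"]
    current_timestep = float(t)

pipe.unet.register_forward_pre_hook(remember_timestep, with_kwargs=True)
pipe.unet.set_attn_processor({
    name: StatsAttnProcessor(name.removesuffix(".processor"))
    for name in pipe.unet.attn_processors})

pipe("an astronaut riding a horse").images[0].save("generated_image.png")

Printing stats_log grouped by timestep would reproduce the shape of this dump: one "Timestep t" header, then one line per attn1/attn2 module.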
Timestep 881.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.8188
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0237, min=0.0000, max=0.6626
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1411
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6772
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.5659
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.4775
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.7920
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.6587
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0065, min=0.0000, max=0.9971
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.8511
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.5825
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.9746
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.9224
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.9902
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6094
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.9653
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5791
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9937
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2494
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0463, min=0.0000, max=0.9937
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2773
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.8438
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2661
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=0.8838
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2336
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0563, min=0.0000, max=0.9619
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0007, min=0.0000, max=0.1952
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.9233
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8892
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.9561
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4580
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9648
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3164
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0377, min=0.0000, max=0.9966
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2559
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0422, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3286
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9985
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2230
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0499, min=0.0000, max=0.9995
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3379
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3518
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0474, min=0.0000, max=0.9980
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2815
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.9927
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4922
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9238
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0034, min=0.0000, max=0.7964
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.7612
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4468
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0249, min=0.0000, max=0.9131
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5288
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.9331
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5991
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.6548
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.7305
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0405, min=0.0000, max=0.8159
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.4688
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0422, min=0.0000, max=0.9287
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5039
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0335, min=0.0000, max=0.7251
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3513
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.7393
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4448
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0415, min=0.0000, max=0.7065
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3840
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.8799
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0045, min=0.0000, max=0.7080
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.8628
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0030, min=0.0000, max=0.2983
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0286, min=0.0000, max=0.9414
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.2903
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.9785
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3193
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.8887
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2362
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0328, min=0.0000, max=0.9878
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2332
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.8550
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3181
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.7114
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2693
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.9551
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3145
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.8955
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3508
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0539, min=0.0000, max=0.9619
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2979
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.9106
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2173
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.8765
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.1237
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.9731
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.1683
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.9199
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.1578
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9136
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2798
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0366, min=0.0000, max=0.9639
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.4128
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9058
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.4749
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0433, min=0.0000, max=0.9834
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.6182
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0502, min=0.0000, max=0.9575
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.4067
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0538, min=0.0000, max=0.9751
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0032, min=0.0000, max=0.6514
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0272, min=0.0000, max=0.5483
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.5942
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0204, min=0.0000, max=0.7651
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.3999
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0188, min=0.0000, max=0.4514
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.5190
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0262, min=0.0000, max=0.7173
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3069
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.9146
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.4746
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9062
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3218
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.8501
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5889
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9014
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5498
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0431, min=0.0000, max=0.9067
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.3987
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0514, min=0.0000, max=0.8984
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.2583
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.5332
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.0782
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0311, min=0.0000, max=0.6650
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.1737
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.5576
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.0954
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.5679
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1334
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.4666
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0185
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.5830
Timestep 901.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.8149
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0238, min=0.0000, max=0.6611
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1290
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6709
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.5557
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.4756
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.7788
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.6606
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0064, min=0.0000, max=0.9976
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0321, min=0.0000, max=0.8594
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.6040
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.9775
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.9048
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0390, min=0.0000, max=0.9922
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5435
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.9639
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.5264
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9937
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.2101
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0462, min=0.0000, max=0.9922
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3169
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0527, min=0.0000, max=0.8433
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2795
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=0.8818
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2233
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0562, min=0.0000, max=0.9609
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0007, min=0.0000, max=0.1986
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9224
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.8921
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.9585
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4346
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9561
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2856
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2585
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0425, min=0.0000, max=0.9995
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3171
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0452, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2595
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0506, min=0.0000, max=0.9995
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3279
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0416, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3250
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0476, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2759
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0499, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4849
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0501, min=0.0000, max=0.9268
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.8677
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.7642
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.4368
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0252, min=0.0000, max=0.9180
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.5039
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0361, min=0.0000, max=0.9331
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0028, min=0.0000, max=0.5444
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.6592
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.7095
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0408, min=0.0000, max=0.8188
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4736
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0425, min=0.0000, max=0.9351
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5225
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.7266
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3535
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.7500
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3792
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0416, min=0.0000, max=0.7051
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3535
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0453, min=0.0000, max=0.8882
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0046, min=0.0000, max=0.7725
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0257, min=0.0000, max=0.8433
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.2866
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0293, min=0.0000, max=0.9502
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2998
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.9771
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3684
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.8960
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2152
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0333, min=0.0000, max=0.9907
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2294
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0266, min=0.0000, max=0.8613
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2791
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.7036
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2551
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0399, min=0.0000, max=0.9482
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2983
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.8945
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3555
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0541, min=0.0000, max=0.9619
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2505
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0312, min=0.0000, max=0.9155
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2202
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.8979
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.1271
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.9780
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.1725
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0375, min=0.0000, max=0.9316
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1450
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.9248
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2688
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.9639
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3745
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.9028
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.4529
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.9824
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5991
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9580
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.4211
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0536, min=0.0000, max=0.9731
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6436
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0273, min=0.0000, max=0.5474
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5532
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0209, min=0.0000, max=0.7700
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3662
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0192, min=0.0000, max=0.4590
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.5137
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.7427
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3262
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9229
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4609
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.9121
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3179
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0377, min=0.0000, max=0.8525
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6147
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0365, min=0.0000, max=0.8989
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.5127
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.9033
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.3652
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0515, min=0.0000, max=0.8984
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.2566
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.4832
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0753
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0310, min=0.0000, max=0.6533
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1825
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0288, min=0.0000, max=0.5015
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0887
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.5659
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1232
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.4597
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0165
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0358, min=0.0000, max=0.5854
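To analyze a dump like this offline, a small parser is enough. A sketch follows (hypothetical: the file name and output structure are assumptions; only the two line formats are taken from the log itself):

import re
from collections import defaultdict

TIMESTEP_RE = re.compile(r"^Timestep ([\d.]+): (\d+) attention activations recorded\.")
MODULE_RE = re.compile(
    r"^Module (\S+) \((self|cross) attention\): "
    r"mean=([\d.]+), std=([\d.]+), min=([\d.]+), max=([\d.]+)")

def parse_log(path):
    """Return {timestep: {module_name: {"kind", "mean", "std", "min", "max"}}}."""
    stats = defaultdict(dict)
    timestep = None
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if (m := TIMESTEP_RE.match(line)):
                timestep = float(m.group(1))
            elif (m := MODULE_RE.match(line)) and timestep is not None:
                name, kind, mean, std, lo, hi = m.groups()
                stats[timestep][name] = {"kind": kind, "mean": float(mean),
                                         "std": float(std), "min": float(lo),
                                         "max": float(hi)}
    return stats

For example, with stats = parse_log("attention_stats.log"), the lookup stats[901.0]["mid_block.attentions.0.transformer_blocks.0.attn1"]["max"] recovers the 0.8677 reported in the block above, which makes it easy to track a single module across timesteps.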
Timestep 921.0: 140 attention activations recorded.
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0010, min=0.0000, max=0.8027
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0240, min=0.0000, max=0.6621
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1186
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6729
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.5415
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.4802
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.7407
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6650
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0063, min=0.0000, max=0.9980
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0322, min=0.0000, max=0.8652
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.6343
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.9785
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.8818
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0391, min=0.0000, max=0.9932
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4717
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0360, min=0.0000, max=0.9580
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.4666
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0448, min=0.0000, max=0.9932
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.1888
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0461, min=0.0000, max=0.9907
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3086
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0526, min=0.0000, max=0.8408
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2839
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0520, min=0.0000, max=0.8794
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2090
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0562, min=0.0000, max=0.9604
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0007, min=0.0000, max=0.2017
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.9224
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.9009
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0271, min=0.0000, max=0.9604
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4172
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9502
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2612
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9980
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2532
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0427, min=0.0000, max=0.9995
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3098
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0451, min=0.0000, max=0.9990
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.1949
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0512, min=0.0000, max=0.9995
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3242
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0419, min=0.0000, max=0.9976
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3296
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0478, min=0.0000, max=0.9966
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3066
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9922
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.4780
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9268
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9155
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0342, min=0.0000, max=0.7700
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.4634
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0254, min=0.0000, max=0.9185
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.4878
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.9360
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0027, min=0.0000, max=0.5264
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0337, min=0.0000, max=0.6699
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6855
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0410, min=0.0000, max=0.8306
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4736
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0428, min=0.0000, max=0.9419
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.5352
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.7310
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3594
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.7529
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3979
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.7041
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3232
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0456, min=0.0000, max=0.8936
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0047, min=0.0000, max=0.8267
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.8252
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.3347
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0299, min=0.0000, max=0.9541
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.2722
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.9697
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3865
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.8975
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2438
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9922
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2299
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.8696
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2465
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.7085
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2416
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.9414
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2883
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.8911
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3647
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0543, min=0.0000, max=0.9648
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.2469
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.9180
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2181
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.9092
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.1322
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.9795
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1566
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9395
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.1354
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0319, min=0.0000, max=0.9370
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.2737
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0374, min=0.0000, max=0.9619
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3347
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0347, min=0.0000, max=0.8989
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.4253
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.9810
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5737
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0499, min=0.0000, max=0.9561
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.4348
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0534, min=0.0000, max=0.9692
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6299
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.5430
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5063
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0212, min=0.0000, max=0.7603
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3420
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0195, min=0.0000, max=0.4841
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.5044
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.7539
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3489
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.9238
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4404
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.9082
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3271
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.8535
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.6172
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0368, min=0.0000, max=0.8950
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4402
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8911
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.3254
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8994
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.2539
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0278, min=0.0000, max=0.4338
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0761
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.6353
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1847
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0285, min=0.0000, max=0.5059
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0821
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.5625
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.1169
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.4539
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0145
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0356, min=0.0000, max=0.5884 | |
Timestep 941.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.7930 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0242, min=0.0000, max=0.6611 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1151 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6782 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.5249 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.4756 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.6924 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.6685 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0062, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0324, min=0.0000, max=0.8711 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.6611 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0331, min=0.0000, max=0.9790 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0024, min=0.0000, max=0.8584 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0392, min=0.0000, max=0.9941 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3853 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0359, min=0.0000, max=0.9502 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.4001 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0447, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.1769 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0460, min=0.0000, max=0.9883 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3123 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0524, min=0.0000, max=0.8394 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2917 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0520, min=0.0000, max=0.8774 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1960 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0561, min=0.0000, max=0.9595 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0006, min=0.0000, max=0.1920 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0493, min=0.0000, max=0.9214 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0037, min=0.0000, max=0.9033 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0269, min=0.0000, max=0.9648 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3945 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9556 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.2350 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2250 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0429, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3118 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0450, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.1882 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0517, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3145 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0423, min=0.0000, max=0.9980 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3369 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0479, min=0.0000, max=0.9966 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3606 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0501, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4731 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9253 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9468 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0343, min=0.0000, max=0.7720 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5029 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0256, min=0.0000, max=0.9243 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4832 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.9399 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.5645 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.6777 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.6597 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0411, min=0.0000, max=0.8364 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.4600 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.9478 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.5586 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.7349 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3901 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.7495 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.4246 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.7080 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2896 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.8960 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0049, min=0.0000, max=0.8779 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0260, min=0.0000, max=0.7993 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4097 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.9556 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2554 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0302, min=0.0000, max=0.9492 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3398 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.9106 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2473 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.9937 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2301 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0268, min=0.0000, max=0.8711 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2181 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.7100 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2296 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0404, min=0.0000, max=0.9512 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2786 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3696 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0544, min=0.0000, max=0.9658 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2515 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0316, min=0.0000, max=0.9229 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.2197 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.9155 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.1360 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0369, min=0.0000, max=0.9800 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1493 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0389, min=0.0000, max=0.9365 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.1316 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.9458 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.2849 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0376, min=0.0000, max=0.9595 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3010 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.8940 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.3977 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0434, min=0.0000, max=0.9800 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5586 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.9526 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.4646 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0531, min=0.0000, max=0.9658 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6221 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.5391 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4919 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0216, min=0.0000, max=0.7544 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3491 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0199, min=0.0000, max=0.4985 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.4937 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.7651 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3735 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0336, min=0.0000, max=0.9238 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4292 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0334, min=0.0000, max=0.9004 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3259 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0384, min=0.0000, max=0.8535 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6221 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0370, min=0.0000, max=0.8887 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4473 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8794 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2932 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8999 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.2942 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.4028 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0861 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.6128 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.1887 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0284, min=0.0000, max=0.5127 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0005, min=0.0000, max=0.0753 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0290, min=0.0000, max=0.5581 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.1118 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0300, min=0.0000, max=0.4487 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0126 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0354, min=0.0000, max=0.5903 | |
Timestep 961.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.7847 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0244, min=0.0000, max=0.6582 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1107 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.6812 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0007, min=0.0000, max=0.4963 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0294, min=0.0000, max=0.4783 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.6089 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6733 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0061, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0325, min=0.0000, max=0.8735 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.6865 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0330, min=0.0000, max=0.9790 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.8232 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0393, min=0.0000, max=0.9946 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3640 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0357, min=0.0000, max=0.9380 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3264 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0445, min=0.0000, max=0.9927 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.1895 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0459, min=0.0000, max=0.9849 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3259 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0522, min=0.0000, max=0.8364 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3018 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0518, min=0.0000, max=0.8735 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1854 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0559, min=0.0000, max=0.9580 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0006, min=0.0000, max=0.1694 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0489, min=0.0000, max=0.9194 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.9077 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.9668 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3716 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0280, min=0.0000, max=0.9614 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2156 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2224 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.3105 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0448, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2086 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=1.0000 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3093 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0427, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3391 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0480, min=0.0000, max=0.9971 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.3574 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0500, min=0.0000, max=0.9932 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.4673 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0498, min=0.0000, max=0.9199 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0035, min=0.0000, max=0.9575 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.7671 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5620 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0258, min=0.0000, max=0.9336 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0022, min=0.0000, max=0.4871 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0363, min=0.0000, max=0.9414 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.6558 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.6865 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.6211 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.8394 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.4407 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9536 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5732 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0339, min=0.0000, max=0.7373 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4587 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0380, min=0.0000, max=0.7402 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3247 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0417, min=0.0000, max=0.7148 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2502 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0458, min=0.0000, max=0.8955 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0050, min=0.0000, max=0.9043 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0263, min=0.0000, max=0.7637 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.4087 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0309, min=0.0000, max=0.9546 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2498 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0303, min=0.0000, max=0.9517 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3269 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.9282 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2399 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0340, min=0.0000, max=0.9941 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2374 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0267, min=0.0000, max=0.8843 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.1974 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0282, min=0.0000, max=0.7051 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2207 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0403, min=0.0000, max=0.9575 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2661 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0412, min=0.0000, max=0.9058 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3855 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0544, min=0.0000, max=0.9683 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2485 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0317, min=0.0000, max=0.9248 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.2262 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.9175 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.1393 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0374, min=0.0000, max=0.9814 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.1541 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.9385 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.1245 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0323, min=0.0000, max=0.9512 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0009, min=0.0000, max=0.2983 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0379, min=0.0000, max=0.9590 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2815 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.8984 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.3613 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9795 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5400 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0492, min=0.0000, max=0.9502 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5049 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0528, min=0.0000, max=0.9648 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0031, min=0.0000, max=0.6455 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0276, min=0.0000, max=0.5337 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.7095 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0220, min=0.0000, max=0.7471 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3337 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0205, min=0.0000, max=0.4968 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.4844 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0289, min=0.0000, max=0.7852 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3770 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.9238 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4729 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3127 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0385, min=0.0000, max=0.8564 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.6006 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0371, min=0.0000, max=0.8862 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.3779 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8682 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2571 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8994 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0008, min=0.0000, max=0.3350 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0275, min=0.0000, max=0.3796 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0853 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0308, min=0.0000, max=0.6016 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1899 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.5229 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0678 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0291, min=0.0000, max=0.5474 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.1034 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0298, min=0.0000, max=0.4458 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0107 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0352, min=0.0000, max=0.5952 | |
Timestep 981.0: 140 attention activations recorded. | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.7705 | |
Module down_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0245, min=0.0000, max=0.6479 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0003, min=0.0000, max=0.1073 | |
Module down_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0344, min=0.0000, max=0.6865 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0006, min=0.0000, max=0.4692 | |
Module down_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0296, min=0.0000, max=0.4875 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.5552 | |
Module down_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.6792 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0060, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.8745 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.7168 | |
Module down_blocks.2.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0329, min=0.0000, max=0.9795 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7725 | |
Module down_blocks.2.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.9951 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3342 | |
Module down_blocks.2.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0354, min=0.0000, max=0.9180 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.2505 | |
Module down_blocks.2.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0442, min=0.0000, max=0.9927 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1892 | |
Module down_blocks.2.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.9805 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3267 | |
Module down_blocks.2.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0521, min=0.0000, max=0.8345 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.3171 | |
Module down_blocks.2.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8687 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.2319 | |
Module down_blocks.2.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0557, min=0.0000, max=0.9580 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0006, min=0.0000, max=0.1476 | |
Module down_blocks.2.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0485, min=0.0000, max=0.9209 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.9131 | |
Module down_blocks.2.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0265, min=0.0000, max=0.9707 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3479 | |
Module down_blocks.2.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0281, min=0.0000, max=0.9697 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.2086 | |
Module down_blocks.2.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0381, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2023 | |
Module down_blocks.2.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9995 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.2974 | |
Module down_blocks.2.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0446, min=0.0000, max=0.9990 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.2191 | |
Module down_blocks.2.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0527, min=0.0000, max=1.0000 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3064 | |
Module down_blocks.2.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0432, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3411 | |
Module down_blocks.2.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0480, min=0.0000, max=0.9985 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.2261 | |
Module down_blocks.2.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0497, min=0.0000, max=0.9922 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0038, min=0.0000, max=0.4727 | |
Module down_blocks.2.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0496, min=0.0000, max=0.9194 | |
Module mid_block.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0036, min=0.0000, max=0.9658 | |
Module mid_block.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0346, min=0.0000, max=0.7603 | |
Module mid_block.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0023, min=0.0000, max=0.6353 | |
Module mid_block.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0259, min=0.0000, max=0.9458 | |
Module mid_block.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.5059 | |
Module mid_block.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0362, min=0.0000, max=0.9365 | |
Module mid_block.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0026, min=0.0000, max=0.6338 | |
Module mid_block.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.7046 | |
Module mid_block.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.5732 | |
Module mid_block.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0413, min=0.0000, max=0.8364 | |
Module mid_block.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0019, min=0.0000, max=0.3960 | |
Module mid_block.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0433, min=0.0000, max=0.9570 | |
Module mid_block.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.6138 | |
Module mid_block.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0338, min=0.0000, max=0.7319 | |
Module mid_block.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5693 | |
Module mid_block.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.7339 | |
Module mid_block.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3264 | |
Module mid_block.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0416, min=0.0000, max=0.7178 | |
Module mid_block.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.3218 | |
Module mid_block.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0457, min=0.0000, max=0.8901 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0052, min=0.0000, max=0.8877 | |
Module up_blocks.0.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0270, min=0.0000, max=0.7129 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0029, min=0.0000, max=0.5591 | |
Module up_blocks.0.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0315, min=0.0000, max=0.9429 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2683 | |
Module up_blocks.0.attentions.0.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0304, min=0.0000, max=0.9507 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.3450 | |
Module up_blocks.0.attentions.0.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0349, min=0.0000, max=0.9492 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2722 | |
Module up_blocks.0.attentions.0.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0341, min=0.0000, max=0.9932 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2238 | |
Module up_blocks.0.attentions.0.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0264, min=0.0000, max=0.8921 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0013, min=0.0000, max=0.2478 | |
Module up_blocks.0.attentions.0.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.7173 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2299 | |
Module up_blocks.0.attentions.0.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0395, min=0.0000, max=0.9565 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.2554 | |
Module up_blocks.0.attentions.0.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0409, min=0.0000, max=0.9126 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0014, min=0.0000, max=0.4258 | |
Module up_blocks.0.attentions.0.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0544, min=0.0000, max=0.9678 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0021, min=0.0000, max=0.7334 | |
Module up_blocks.0.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0320, min=0.0000, max=0.9307 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.2537 | |
Module up_blocks.0.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.9150 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1332 | |
Module up_blocks.0.attentions.1.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0378, min=0.0000, max=0.9839 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0008, min=0.0000, max=0.1135 | |
Module up_blocks.0.attentions.1.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0401, min=0.0000, max=0.9414 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.1348 | |
Module up_blocks.0.attentions.1.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0326, min=0.0000, max=0.9526 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.3008 | |
Module up_blocks.0.attentions.1.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0383, min=0.0000, max=0.9595 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.2751 | |
Module up_blocks.0.attentions.1.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0355, min=0.0000, max=0.9121 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0011, min=0.0000, max=0.3240 | |
Module up_blocks.0.attentions.1.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0430, min=0.0000, max=0.9795 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0025, min=0.0000, max=0.5034 | |
Module up_blocks.0.attentions.1.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0486, min=0.0000, max=0.9478 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.5317 | |
Module up_blocks.0.attentions.1.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0525, min=0.0000, max=0.9644 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0010, std=0.0033, min=0.0000, max=0.9385 | |
Module up_blocks.0.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0277, min=0.0000, max=0.5342 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0010, std=0.0020, min=0.0000, max=0.9639 | |
Module up_blocks.0.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0228, min=0.0000, max=0.7666 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.2798 | |
Module up_blocks.0.attentions.2.transformer_blocks.2.attn2 (cross attention): mean=0.0130, std=0.0214, min=0.0000, max=0.5132 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn1 (self attention): mean=0.0010, std=0.0012, min=0.0000, max=0.4836 | |
Module up_blocks.0.attentions.2.transformer_blocks.3.attn2 (cross attention): mean=0.0130, std=0.0301, min=0.0000, max=0.8037 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.3154 | |
Module up_blocks.0.attentions.2.transformer_blocks.4.attn2 (cross attention): mean=0.0130, std=0.0348, min=0.0000, max=0.9268 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn1 (self attention): mean=0.0010, std=0.0017, min=0.0000, max=0.4167 | |
Module up_blocks.0.attentions.2.transformer_blocks.5.attn2 (cross attention): mean=0.0130, std=0.0345, min=0.0000, max=0.8989 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn1 (self attention): mean=0.0010, std=0.0015, min=0.0000, max=0.3438 | |
Module up_blocks.0.attentions.2.transformer_blocks.6.attn2 (cross attention): mean=0.0130, std=0.0388, min=0.0000, max=0.8608 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn1 (self attention): mean=0.0010, std=0.0016, min=0.0000, max=0.6079 | |
Module up_blocks.0.attentions.2.transformer_blocks.7.attn2 (cross attention): mean=0.0130, std=0.0372, min=0.0000, max=0.8823 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn1 (self attention): mean=0.0010, std=0.0018, min=0.0000, max=0.3447 | |
Module up_blocks.0.attentions.2.transformer_blocks.8.attn2 (cross attention): mean=0.0130, std=0.0435, min=0.0000, max=0.8545 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn1 (self attention): mean=0.0010, std=0.0010, min=0.0000, max=0.3169 | |
Module up_blocks.0.attentions.2.transformer_blocks.9.attn2 (cross attention): mean=0.0130, std=0.0516, min=0.0000, max=0.8984 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0009, min=0.0000, max=0.9165 | |
Module up_blocks.1.attentions.0.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0274, min=0.0000, max=0.3601 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.1247 | |
Module up_blocks.1.attentions.0.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0307, min=0.0000, max=0.5981 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.2666 | |
Module up_blocks.1.attentions.1.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0283, min=0.0000, max=0.5396 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0004, min=0.0000, max=0.0909 | |
Module up_blocks.1.attentions.1.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0292, min=0.0000, max=0.5278 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0892 | |
Module up_blocks.1.attentions.2.transformer_blocks.0.attn2 (cross attention): mean=0.0130, std=0.0297, min=0.0000, max=0.4431 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn1 (self attention): mean=0.0002, std=0.0002, min=0.0000, max=0.0092 | |
Module up_blocks.1.attentions.2.transformer_blocks.1.attn2 (cross attention): mean=0.0130, std=0.0350, min=0.0000, max=0.5942 | |
Process finished with exit code 0 |
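
Note: the gist contains only the log output, not the script that produced it. For reference, below is a minimal sketch of how per-module attention statistics like these could be collected with diffusers: each attention processor in the SDXL UNet is replaced with one that recomputes vanilla attention and records the softmaxed probabilities, and a forward pre-hook on the UNet tracks which scheduler timestep each call belongs to. The 140 records per timestep match the 140 attn1/attn2 modules in the SDXL UNet. All names here (StatsProcessor, records, current_t) are hypothetical, and the processor is simplified (it skips the group-norm/spatial-norm/residual paths that SDXL's transformer-block attentions do not use); treat it as an illustration under those assumptions, not the actual script behind this log.

import torch
from collections import defaultdict
from diffusers import StableDiffusionXLPipeline
from diffusers.models.attention_processor import Attention

records = defaultdict(list)   # timestep -> [(module, kind, mean, std, min, max)]
current_t = [None]            # mutable cell updated by the UNet forward pre-hook

class StatsProcessor:
    """Recomputes plain attention so the probability matrix can be inspected."""
    def __init__(self, name, kind):
        self.name, self.kind = name, kind

    def __call__(self, attn: Attention, hidden_states,
                 encoder_hidden_states=None, attention_mask=None, **kwargs):
        # attn2 receives encoder_hidden_states (text embeddings); attn1 does not.
        context = hidden_states if encoder_hidden_states is None else encoder_hidden_states
        q = attn.head_to_batch_dim(attn.to_q(hidden_states))
        k = attn.head_to_batch_dim(attn.to_k(context))
        v = attn.head_to_batch_dim(attn.to_v(context))
        probs = attn.get_attention_scores(q, k, attention_mask)  # post-softmax
        records[float(current_t[0])].append(
            (self.name, self.kind, probs.mean().item(), probs.std().item(),
             probs.min().item(), probs.max().item()))
        out = attn.batch_to_head_dim(torch.bmm(probs, v))
        out = attn.to_out[1](attn.to_out[0](out))  # output projection + dropout
        return out

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Remember the scheduler timestep passed to each UNet call (args = (sample, t)).
pipe.unet.register_forward_pre_hook(
    lambda module, args: current_t.__setitem__(0, args[1]))

# attn1 modules are self attention, attn2 modules are cross attention.
pipe.unet.set_attn_processor({
    name: StatsProcessor(name.rsplit(".processor", 1)[0],
                         "self attention" if ".attn1." in name else "cross attention")
    for name in pipe.unet.attn_processors
})

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("generated_image.png")
print("Image saved as 'generated_image.png'.")

print("Attention statistics per diffusion timestep:")
for t in sorted(records):
    print(f"Timestep {t}: {len(records[t])} attention activations recorded.")
    for name, kind, mean, std, mn, mx in records[t]:
        print(f"    Module {name} ({kind}): "
              f"mean={mean:.4f}, std={std:.4f}, min={mn:.4f}, max={mx:.4f}")

With classifier-free guidance the pipeline batches the conditional and unconditional passes into one UNet call, so each module still fires once per timestep, which is consistent with the "140 attention activations recorded" per timestep above.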