Shows that the forward and backward operations of SpatialConvolution and SpatialFullConvolution are swapped...
require 'nn'

-- single-plane input and a 3x3 kernel; biases are zeroed so that only
-- the convolution itself is compared
x = torch.rand(1, 5, 5)

a = nn.SpatialConvolution(1, 1, 3, 3)
a.bias:zero()
ay1 = torch.xcorr2(x, a.weight, 'V')   -- valid cross-correlation
ay2 = a:forward(x)

b = nn.SpatialFullConvolution(1, 1, 3, 3)
b.bias:zero()
by1 = torch.conv2(x, b.weight, 'F')    -- full convolution
by2 = b:forward(x)

-- ay1 should match ay2, and by1 should match by2
print(ay1)
print(ay2)
print(by1)
print(by2)
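If the two pairs above really compute the same thing, their element-wise differences should be numerically zero. A quick check, as a sketch that assumes the outputs only differ by singleton dimensions that squeeze() removes:

-- compare the layer outputs against the raw torch.xcorr2/torch.conv2 results
print((ay1:squeeze() - ay2:squeeze()):abs():max())   -- expected ~0
print((by1:squeeze() - by2:squeeze()):abs():max())   -- expected ~0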
require 'nn'

-- a maps 7 planes -> 5 planes; b maps 5 planes -> 7 planes and shares the
-- same weight tensor, so b is the "transposed" counterpart of a
a = nn.SpatialConvolution(7, 5, 3, 3)
a.bias:zero()
b = nn.SpatialFullConvolution(5, 7, 3, 3)
b.bias:zero()
b.weight = a.weight

test  = torch.rand(7, 12, 12)
test2 = torch.rand(5, 10, 10)

-- forward of a equals backward (gradInput) of b, and vice versa
y1 = a:forward(test)
y2 = b:backward(torch.zeros(5, 10, 10), test)
z1 = a:backward(torch.zeros(7, 12, 12), test2)
z2 = b:forward(test2)

-- prints 0 twice
print((y1 - y2):abs():sum())
print((z1 - z2):abs():sum())
Here's a fun question: if the stride is 1, is a SpatialFullConvolution layer equivalent to a SpatialConvolution layer? Or are there tricks like transposing the kernel matrix?
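One way to check this, sketched below under the assumption that the forward passes behave like the torch.xcorr2/torch.conv2 calls in the first snippet and that nn.SpatialConvolution takes (nInputPlane, nOutputPlane, kW, kH, dW, dH, padW, padH): with stride 1, a full convolution equals a valid cross-correlation over an input zero-padded by kW-1/kH-1, with the kernel rotated 180 degrees (and, with several planes, the input/output plane dimensions of the weight swapped). The single-plane case:

require 'nn'

full = nn.SpatialFullConvolution(1, 1, 3, 3)
full.bias:zero()

-- SpatialConvolution with padW = padH = kW-1 = 2 and the kernel flipped
-- along both spatial dimensions (dims 3 and 4 of the 1x1x3x3 weight)
conv = nn.SpatialConvolution(1, 1, 3, 3, 1, 1, 2, 2)
conv.bias:zero()
rev = torch.linspace(3, 1, 3):long()   -- indices {3, 2, 1}
conv.weight:copy(full.weight:index(3, rev):index(4, rev))

x = torch.rand(1, 5, 5)
print((full:forward(x) - conv:forward(x)):abs():max())   -- expected ~0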
So a SpatialFullConvolution layer is just doing "deconvolution" / "fractionally-strided convolution" / "upconvolution" / ... ?
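If the swap the gist demonstrates holds, then yes: its forward pass is the transposed-convolution operation those names refer to, and with a stride greater than 1 it upsamples its input. A small illustration (the kernel size, stride, and padding below are just example values, assuming the nn.SpatialFullConvolution(nInputPlane, nOutputPlane, kW, kH, dW, dH, padW, padH) signature):

require 'nn'

-- kW = kH = 4, dW = dH = 2, padW = padH = 1 doubles each spatial
-- dimension, since outH = (inH - 1)*dH - 2*padH + kH
up = nn.SpatialFullConvolution(3, 8, 4, 4, 2, 2, 1, 1)
x = torch.rand(3, 16, 16)
y = up:forward(x)
print(#y)   -- expected 8 x 32 x 32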