Created
May 1, 2014 01:36
Gist showing how to use the *CUDA modules.
features = nn.Sequential()
features:add(nn.Transpose({1,4},{1,3},{1,2}))
features:add(nn.SpatialConvolutionCUDA(fSize[1], fSize[2], 9, 9, 2, 2)) -- (111 - 9 + 2)/2 = 52
features:add(nn.Threshold(0,1e-6))
features:add(nn.SpatialMaxPoolingCUDA(2,2,2,2)) -- 26
features:add(nn.SpatialConvolutionCUDA(fSize[2], fSize[3], 5, 5)) -- 22
features:add(nn.Threshold(0,1e-6))
features:add(nn.SpatialMaxPoolingCUDA(2,2,2,2)) -- 11
features:add(nn.SpatialConvolutionCUDA(fSize[3], fSize[4], 4, 4)) -- 8
features:add(nn.Threshold(0,1e-6))
features:add(nn.SpatialConvolutionCUDA(fSize[4], fSize[5], 3, 3)) -- 6
features:add(nn.Threshold(0,1e-6))
features:add(nn.SpatialMaxPoolingCUDA(2,2,2,2)) -- 3
features:add(nn.Transpose({4,1},{4,2},{4,3}))
features:add(nn.Reshape(featuresOut))
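For context, here is a minimal sketch of how this feature extractor might be driven. The gist does not define fSize, featuresOut, or the input, so the values below are hypothetical, chosen to match the spatial-size arithmetic in the comments above (111 → 52 → 26 → 22 → 11 → 8 → 6 → 3):

require 'cunn'   -- provides the CUDA-backed nn modules

-- hypothetical values: the gist does not define fSize or featuresOut
fSize = {3, 64, 128, 128, 256}
featuresOut = fSize[5] * 3 * 3   -- final feature maps are 3x3 (see comments above)

-- a batch of 128 RGB images at 111x111 (batch x channels x height x width);
-- the leading nn.Transpose converts this to channels x height x width x batch,
-- the layout the cuda-convnet-based *CUDA modules expect, and the trailing
-- nn.Transpose moves the batch dimension back to the front before the Reshape
input = torch.CudaTensor(128, 3, 111, 111)
output = features:forward(input)  -- 128 x featuresOut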
Why do you use a Threshold module? Are you using it as a ReLU?

Oops, Gist doesn't seem to have notifications on comments, so I didn't notice this until now. Yes, ReLU is equivalent to nn.Threshold(0,0).