M. Farrajota (farrajota)

@farrajota
farrajota / conv_to_linear.lua
Last active July 26, 2017 10:33
Convert a fully-connected (linear) layer to a convolution layer
require 'nn'
-- you just need to provide the linear module you want to convert,
-- and the dimensions of the field of view of the linear layer
function convertLinear2Conv1x1(linmodule, in_size)
    --[[
    Convert Linear modules to convolution modules.
    Arguments
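A minimal, self-contained sketch of such a conversion, assuming in_size = {kH, kW} is the field of view of the linear layer; the helper name linearToConv and its body are illustrative (the standard weight copy into an nn.SpatialConvolution), not necessarily the gist's exact code:
require 'nn'

function linearToConv(linmodule, in_size)
    local kH, kW  = in_size[1], in_size[2]
    local nInput  = linmodule.weight:size(2) / (kH * kW)
    local nOutput = linmodule.weight:size(1)
    local conv = nn.SpatialConvolution(nInput, nOutput, kW, kH)
    conv.weight:copy(linmodule.weight)  -- same number of elements, just reshaped
    conv.bias:copy(linmodule.bias)
    return conv
end

-- e.g. converting a 4096->1000 classifier whose field of view is 1x1:
-- local conv = linearToConv(nn.Linear(4096, 1000), {1, 1})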
@farrajota
farrajota / gist:e5ebe0726535c9049a9e
Created January 27, 2016 23:41
how to disable learning on a layer
m.updateGradInput = function(self, i, o) end   -- stop computing/propagating gradInput below this layer
m.accGradParameters = function(self, i, o) end -- freeze the parameters (no gradient accumulation)
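A short usage sketch, assuming the layer to freeze is the first module of a small, hypothetical model:
require 'nn'

local model = nn.Sequential()
model:add(nn.Linear(10, 10))   -- layer whose learning we want to disable
model:add(nn.Linear(10, 2))

local frozen = model.modules[1]
frozen.updateGradInput   = function(self, i, o) end -- no gradient is propagated below it
frozen.accGradParameters = function(self, i, o) end -- its weight/bias gradients stay at zero

-- after forward/backward plus an optimizer step (and assuming no weight decay),
-- model.modules[1].weight is left unchanged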
@farrajota
farrajota / gist:23cadab5eb13c3ef296b
Created February 9, 2016 00:02
split strings by any type of separator in Lua
local function mysplit(inputstr, sep)
    -- split a string by any separator pattern (defaults to whitespace)
    if sep == nil then
        sep = "%s"
    end
    local t, i = {}, 1
    for str in string.gmatch(inputstr, "([^" .. sep .. "]+)") do
        t[i] = str
        i = i + 1
    end
    return t
end
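A quick usage example (the sample string and separator are arbitrary):
local parts = mysplit("a,b,c", ",")
-- parts = { "a", "b", "c" }
print(#parts)  -- 3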
@farrajota
farrajota / pretrained_models.lua
Last active August 8, 2021 12:22
Torch7 pre-trained models available for download
-- Available torch7 pre-trained models for download
Alexnet (trained by me)
cudnn: https://ln.sync.com/dl/3d9c28e80#cvjp4mjs-mc2y8jrt-9wq5yebx-yrdchcei
mean = {0.48037518790839, 0.45039056120456, 0.39922636057037}
std = {0.27660147027775, 0.26883440068399, 0.28014687231841}
img size: 3x224x224
overfeat: https://github.com/jhjin/overfeat-torch
mean = {118.380948, 118.380948, 118.380948}
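A preprocessing sketch, assuming the AlexNet statistics listed above (RGB values in [0,1], input resized to 3x224x224); image.lena() is just a stand-in for any 3-channel image:
require 'image'

local mean = {0.48037518790839, 0.45039056120456, 0.39922636057037}
local std  = {0.27660147027775, 0.26883440068399, 0.28014687231841}

local img = image.scale(image.lena(), 224, 224)  -- 3x224x224, values in [0,1]
for c = 1, 3 do
    img[c]:add(-mean[c]):div(std[c])             -- per-channel normalization
end
-- img is now ready for net:forward(img:view(1, 3, 224, 224))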
@farrajota
farrajota / table2file.lua
Last active August 25, 2016 09:15
save Lua table contents to a file (useful for logging purposes)
function save_configs(filename, opt)
    -- prints the contents of a table into a file
    local function table_print(tt, indent, done)
        done = done or {}
        indent = indent or 0
        if type(tt) == "table" then
            local sb = {}
            for key, value in pairs(tt) do
                table.insert(sb, string.rep(" ", indent)) -- indent it
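A simplified sketch of the same idea for flat option tables (the helper name save_configs_flat is hypothetical; the gist's own version handles nested tables recursively):
local function save_configs_flat(filename, opt)
    local f = assert(io.open(filename, 'w'))
    for k, v in pairs(opt) do
        f:write(('%s = %s\n'):format(tostring(k), tostring(v)))
    end
    f:close()
end

save_configs_flat('config.log', {learningRate = 0.01, batchSize = 32})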
@farrajota
farrajota / multiple_learning_rates.lua
Last active April 10, 2018 16:47
Example code for setting different learning rates per layer. Note that :parameters() returns the weight and bias of a given layer as separate, consecutive tensors, so a network with N parameterized layers yields a table of N*2 tensors, where the i-th and (i+1)-th tensors belong to the same layer.
-- multiple learning rates per network. Optimizes two copies of a model network and checks if the optimization steps (2) and (3) produce the same weights/parameters.
require 'torch'
require 'nn'
require 'optim'
torch.setdefaulttensortype('torch.FloatTensor')
-- (1) Define a model for this example.
local model = nn.Sequential()
model:add(nn.Linear(10,20))
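A condensed sketch of one way to do this with optim.sgd's learningRates vector; the layer sizes and rate values are arbitrary:
require 'nn'
require 'optim'

local model = nn.Sequential()
model:add(nn.Linear(10, 20))
model:add(nn.ReLU())
model:add(nn.Linear(20, 2))

local params = model:getParameters()

-- per-parameter learning-rate multipliers; model:parameters() returns
-- {weight1, bias1, weight2, bias2}, flattened in that same order by getParameters()
local layerLrs = {1.0, 1.0, 0.1, 0.1}
local lrs = torch.Tensor(params:nElement())
local offset = 1
for i, p in ipairs(model:parameters()) do
    local n = p:nElement()
    lrs[{{offset, offset + n - 1}}]:fill(layerLrs[i])
    offset = offset + n
end

local optimState = {learningRate = 0.01, learningRates = lrs}
-- optim.sgd(feval, params, optimState) now scales each step per parameter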
@farrajota
farrajota / binary_tree_torch.lua
Last active May 1, 2017 14:48
Small example of how to create a binary tree in Torch7 using nn containers and nngraph.
--[[
Create a binary tree in two ways: with nn containers and with nn.gModule containers.
This example is fairly simple: the fully-connected layers all default to size 100.
However, it should be easy to modify to allow fc layers with varying input/output
sizes if desired (for example, by passing a table of input/output configuration
values for each level of the sub-branches).
]]
require 'nn'
require 'nngraph'
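A tiny sketch of the nngraph flavour of such a node, assuming two branches of size 100 joined along the feature dimension (names and sizes are illustrative):
require 'nn'
require 'nngraph'

local input = nn.Identity()()
local left  = nn.ReLU()(nn.Linear(100, 100)(input))   -- left child branch
local right = nn.ReLU()(nn.Linear(100, 100)(input))   -- right child branch
local join  = nn.JoinTable(2)({left, right})          -- concatenate both branches
local node  = nn.gModule({input}, {join})

local out = node:forward(torch.rand(1, 100))           -- out is 1x200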
@farrajota
farrajota / data_transforms.lua
Last active January 8, 2018 03:25
Commonly used data augmentation techniques for Torch7.
require 'image'
local M = {}
function M.Compose(transforms)
    -- chain several transforms into a single callable
    return function(input)
        for _, transform in ipairs(transforms) do
            input = transform(input)
        end
        return input
    end
end
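A usage sketch, assuming M is the module table returned by this file; the two inline transforms are placeholders, not transforms defined in the gist:
require 'image'

local transform = M.Compose{
    function(img) return image.scale(img, 224, 224) end,  -- resize
    function(img) return img:add(-img:mean()) end,        -- crude mean subtraction
}

local out = transform(image.lena())  -- 3x224x224, roughly zero-mean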
@farrajota
farrajota / mpii_mat2json.m
Last active December 27, 2023 13:45
Convert MPII annotations from .mat to .json format
function mpii_convert_json( )
% convert mpii annotations .mat file to .json
%% load annotation file
fprintf('Load annotations... ')
data = load('/media/HDD2/Datasets/Human_Pose/mpii/mpii_human_pose_v1_u12_2/mpii_human_pose_v1_u12_1.mat');
fprintf('Done.\n')
%% open file
fprintf('Open file mpii_human_pose_annotations.json\n')
@farrajota
farrajota / freeze.lua
Created September 6, 2016 14:17
freeze parameters of a layer
model.modules[1].parameters = function() return nil end -- hides the layer's parameters from getParameters()/optim
model.modules[1].accGradParameters = function() end -- skip gradient accumulation to reduce computations
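A short sketch of the effect on a small, hypothetical model: after the two overrides, getParameters() only exposes the remaining layer, so optim never touches the frozen one:
require 'nn'

local model = nn.Sequential()
model:add(nn.Linear(10, 10))   -- layer to freeze
model:add(nn.Linear(10, 2))

model.modules[1].parameters = function() return nil end
model.modules[1].accGradParameters = function() end

local params, gradParams = model:getParameters()
print(params:nElement())  -- 22 = 10*2 weights + 2 biases of the second layer only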