creating an lstm with 2 layers
setting forget gate biases to 1 in LSTM layer 1
setting forget gate biases to 1 in LSTM layer 2
number of parameters in the model: 240321
cloning rnn
cloning criterion
/Users/jfsantos/torch/install/bin/luajit: /Users/jfsantos/torch/install/share/lua/5.1/nn/Identity.lua:13: bad argument #1 to 'set' (expecting number or Tensor or Storage at /tmp/luarocks_torch-scm-1-8315/torch7/generic/Tensor.c:1089)
stack traceback:
	[C]: in function 'set'
	/Users/jfsantos/torch/install/share/lua/5.1/nn/Identity.lua:13: in function 'func'
signal = require "signal"
complex = require "signal.complex"

-- Inverse STFT by overlap-add: X holds one complex frame per row,
-- win is the frame length in samples, hop is the hop size in samples.
function istft(X, win, hop)
   local x = torch.zeros((X:size(1)-1)*hop + win)
   framesamp = X:size(2)
   hopsamp = hop
   for n=1,X:size(1) do
      i = 1 + (n-1)*hopsamp
      print(i, i + framesamp - 1)
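The preview cuts off inside the loop. For reference, here is a minimal NumPy sketch of the same overlap-add reconstruction; no synthesis window or normalization is applied, since none is visible in the Lua code, and everything beyond the visible lines is my assumption rather than the gist's actual continuation:

import numpy as np

def istft_np(X, win, hop):
    # X: (n_frames, win) array of complex spectra, one frame per row
    n_frames, framesamp = X.shape
    x = np.zeros((n_frames - 1) * hop + win)
    for n in range(n_frames):
        i = n * hop
        # inverse FFT of the frame, keep the real part, overlap-add into the output
        x[i:i + framesamp] += np.real(np.fft.ifft(X[n]))
    return x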
-- Two stacked LSTMs with dropout in between, wrapped in a Sequencer
-- (torch 'rnn' package); nFeatures and nHidden are assumed to be defined earlier.
model = nn.Sequential()
lstm = nn.Sequencer(
   nn.Sequential()
      :add(nn.LSTM(nFeatures,nHidden))
      :add(nn.Dropout())
      :add(nn.LSTM(nHidden,nHidden))
)
lstm:remember('neither') -- force model to call forget at each call to forward
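For comparison, a rough PyTorch sketch of the same structure, two stacked LSTM layers with dropout applied between them; the sizes, dropout rate, and the batch of random data are placeholders, not anything taken from the gist:

import torch
import torch.nn as nn

n_features, n_hidden = 40, 100   # placeholders standing in for nFeatures/nHidden

# dropout= is applied between the two stacked layers, like the nn.Dropout() above
lstm = nn.LSTM(n_features, n_hidden, num_layers=2, dropout=0.5)

# calling it without an initial hidden state starts from zeros on every forward,
# which plays the same role as Sequencer:remember('neither')
x = torch.randn(16, 8, n_features)   # (seq_len, batch, features)
out, (h, c) = lstm(x)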
from __future__ import print_function
from keras.models import Model, Sequential
from keras.layers import Input, Dense, TimeDistributed
from keras.layers.core import Reshape, Flatten, Dropout, TimeDistributedDense
from keras.layers.advanced_activations import LeakyReLU
from keras.layers.normalization import BatchNormalization
from keras.layers.convolutional import Convolution2D
from keras.layers.recurrent import LSTM
from keras.optimizers import Adam
from keras.models import Sequential
from keras.layers import Dense
from keras.utils.io_utils import HDF5Matrix
import numpy as np

def create_dataset():
    import h5py
    X = np.random.randn(200,10).astype('float32')
    y = np.random.randint(0, 2, size=(200,1))
    f = h5py.File('test.h5', 'w')
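The preview stops at the h5py.File call. A hedged sketch of how the example might continue, reusing the imports above; the dataset keys 'data'/'labels', the train split, and the one-layer model are my assumptions:

    # (continuing inside create_dataset) write both arrays and close the file
    f.create_dataset('data', data=X)
    f.create_dataset('labels', data=y)
    f.close()

create_dataset()

# HDF5Matrix exposes a dataset in the file as a lazily-read, array-like object
X_train = HDF5Matrix('test.h5', 'data', start=0, end=150)
y_train = HDF5Matrix('test.h5', 'labels', start=0, end=150)

model = Sequential()
model.add(Dense(1, input_dim=10, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='sgd')

# shuffle must stay False (or 'batch'): HDF5 data has to be read in index order
model.fit(X_train, y_train, batch_size=32, shuffle=False)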
// Use Gists to store code you would like to remember later on
console.log(window); // log the "window" object to the console
'''
A logistic regression example using the meta-graph checkpointing
features of TensorFlow.
Author: João Felipe Santos, based on code by Aymeric Damien
(https://github.com/aymericdamien/TensorFlow-Examples/)
'''
from __future__ import print_function
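Only the header survives in the preview, so here is a hedged, self-contained sketch of the meta-graph mechanics the docstring refers to, using the TF1-era Saver API; the variable names, shapes, and checkpoint path are placeholders, not the gist's actual code:

import tensorflow as tf

# build a softmax-regression graph and save both variables and graph structure
x = tf.placeholder(tf.float32, [None, 784], name='x')
y = tf.placeholder(tf.float32, [None, 10], name='y')
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
pred = tf.nn.softmax(tf.matmul(x, W) + b, name='pred')

saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... build the loss/train op and run training steps here ...
    # save() writes the variable values plus a .meta file holding the graph itself
    saver.save(sess, 'model.ckpt')

# later, possibly in a fresh process: rebuild the graph from the .meta file,
# then restore the variables into it, with no Python model-building code needed
tf.reset_default_graph()
with tf.Session() as sess:
    new_saver = tf.train.import_meta_graph('model.ckpt.meta')
    new_saver.restore(sess, 'model.ckpt')
    pred = tf.get_default_graph().get_tensor_by_name('pred:0')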
Agility
&{template:default} {{name=@{selected|character_name}}}{{Agility roll=[[1d20 + @{selected|agility_mod} + [[?{# Boons|0} - ?{# Banes|0}]]d6k1]]}}
Intellect
&{template:default} {{name=@{selected|character_name}}}{{Intellect roll=[[1d20 + @{selected|intellect_mod} + [[?{# Boons|0} - ?{# Banes|0}]]d6k1]]}}
Perception
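For anyone not fluent in Roll20 macro syntax, a hedged Python sketch of what each of these expressions computes: 1d20 plus the attribute modifier, plus the highest of (boons - banes) d6; how Roll20 resolves a zero or negative die count is an assumption here, treated as adding nothing:

import random

def attribute_roll(modifier, boons=0, banes=0):
    # mirrors '1d20 + mod + [[boons - banes]]d6k1' from the macros above
    total = random.randint(1, 20) + modifier
    net = boons - banes
    if net > 0:
        # roll `net` d6 and keep only the highest one (the 'k1' part)
        total += max(random.randint(1, 6) for _ in range(net))
    return total

print(attribute_roll(modifier=2, boons=1))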
from torch.utils.data import Dataset

class DummyDataset(Dataset):
    def __init__(self, items):
        super(DummyDataset, self).__init__()
        self.items = items

    def __getitem__(self, index):
        return self.items[index]
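The preview ends at __getitem__. A map-style Dataset also needs __len__ before a DataLoader can iterate over it, so here is a hedged sketch of the complete class plus a usage example; the sample data and batch size are arbitrary:

from torch.utils.data import Dataset, DataLoader

class DummyDataset(Dataset):
    def __init__(self, items):
        super(DummyDataset, self).__init__()
        self.items = items

    def __getitem__(self, index):
        return self.items[index]

    def __len__(self):
        # DataLoader uses this to know how many indices it may request
        return len(self.items)

loader = DataLoader(DummyDataset(list(range(10))), batch_size=4, shuffle=True)
for batch in loader:
    print(batch)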
import torch
from torch.autograd import Variable

def train_fn(model, optimizer, criterion, batch):
    x, y, lengths = batch
    x = Variable(x.cuda())
    y = Variable(y.cuda(), requires_grad=False)
    # mask has the same shape as x; start with all ones, then zero the first
    # lengths[k] time steps of each sequence so the remaining ones mark padding
    mask = Variable(torch.ByteTensor(x.size()).fill_(1).cuda(),
                    requires_grad=False)
    for k, l in enumerate(lengths):
        mask[:l, k, :] = 0
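The function is cut off right after the mask is built; since the mask starts as ones and is zeroed over the valid time steps, the remaining ones mark padded positions. A hedged sketch of how such a mask is commonly used to keep padding out of the loss follows; the forward pass, the masked_fill calls, and the assumption that the network output has the same shape as x are mine, not the gist's continuation:

def finish_train_step(model, optimizer, criterion, x, y, mask):
    # forward pass, then zero outputs and targets wherever mask == 1 (padding)
    output = model(x)
    output = output.masked_fill(mask, 0.0)
    target = y.masked_fill(mask, 0.0)
    # with padded positions zeroed on both sides, an elementwise criterion such as
    # nn.MSELoss no longer accumulates error from the padding
    loss = criterion(output, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.data[0]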