@kjw0612
kjw0612 / gist:dfec9e3c9ed684a5c283
Created December 15, 2015 05:34
Convert PDF to image
convert -density 300 \[TensorFlow\]\ Sequence-to-Sequence\ Models.pdf TensorFlow/output.png
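Here -density 300 rasterizes the PDF at 300 DPI before conversion; the backslashes escape the brackets and spaces in the filename for the shell. The generic form, with placeholder filenames:

convert -density 300 input.pdf output.png

For a multi-page PDF, ImageMagick writes one image per page (output-0.png, output-1.png, ...).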
@kjw0612
kjw0612 / compress_pdf.md
Created November 12, 2015 14:07
Compress PDF

gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/screen -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf
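/screen is the most aggressive preset, downsampling images to roughly 72 dpi. Ghostscript also accepts /ebook (150 dpi), /printer (300 dpi), and /prepress when more quality is needed; for example:

gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf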

@kjw0612
kjw0612 / invert_alexnet_conv5_deploy.prototxt
Last active September 27, 2015 05:25
Inverting AlexNet. Paper: Inverting Convolutional Networks with Convolutional Networks
name: "CaffeNet"
layers {
name: "data"
type: DATA
top: "data"
data_param {
source: "/misc/lmbraid10/dosovits/Datasets/ILSVRC2012/all/val_leveldb"
backend: LEVELDB
batch_size: 16
crop_size: 227
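    # Note: crop_size 227 is AlexNet's standard input crop, taken from the 256x256 ILSVRC-2012 images.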
@kjw0612
kjw0612 / 128x128_train.prototxt
Created September 27, 2015 05:06
Learning to Generate Chairs prototxt
name: "CaffeNet"
layers {
name: "data"
type: DATA
top: "data"
top: "label"
data_param {
source: "@YOUR_PATH_TO_DATA@/chairs_128x128_reduced/data-lmdb"
batch_size: 64
scale: 0.00390625
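    # Note: scale 0.00390625 = 1/256, mapping 8-bit pixel values into [0, 1).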
@kjw0612
kjw0612 / incremental_history_search.txt
Created September 26, 2015 15:36
An extremely handy tool :: Incremental history searching
In a terminal, open the readline configuration:
gedit ~/.inputrc
Then paste the following lines and save:
"\e[A": history-search-backward
"\e[B": history-search-forward
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
git remote add -t dag matconvnet https://github.com/vlfeat/matconvnet
git pull matconvnet dag

git push --recurse-submodules=on-demand
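Here -t dag makes the matconvnet remote track only its dag branch, and the pull fetches and merges that branch. The --recurse-submodules=on-demand flag first pushes any submodule commits the superproject references, then pushes the superproject itself.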

@kjw0612
kjw0612 / matconvnet_compile.txt
Created September 9, 2015 07:41
Compile MatConvNet
addpath matlab
vl_compilenn('enableGPU', 1, 'cudaRoot', '/usr/local/cuda', 'cudaMethod', 'nvcc', 'enableCudnn', 1, 'cudnnRoot', 'local/');
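This assumes MATLAB is started in the MatConvNet root directory (addpath matlab adds the toolbox functions) and that cuDNN is unpacked under local/. If the build succeeds, running vl_setupnn followed by vl_testnn('gpu', true) is a quick way to verify that the GPU and cuDNN paths work.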
@kjw0612
kjw0612 / neural11lines.py
Last active November 17, 2015 19:28
A Neural Network in 11 lines of Python
import numpy as np

# Inputs: 4 samples with 3 features; targets are the XOR of the first two features.
X = np.array([ [0,0,1],[0,1,1],[1,0,1],[1,1,1] ])
y = np.array([[0,1,1,0]]).T
# Weights in [-1, 1): a 3->4 hidden layer and a 4->1 output layer.
syn0 = 2*np.random.random((3,4)) - 1
syn1 = 2*np.random.random((4,1)) - 1
for j in range(60000):
    # Forward pass through two sigmoid layers.
    l1 = 1/(1+np.exp(-(np.dot(X,syn0))))
    l2 = 1/(1+np.exp(-(np.dot(l1,syn1))))
    # Backpropagation: error times the sigmoid derivative at each layer.
    l2_delta = (y - l2)*(l2*(1-l2))
    l1_delta = l2_delta.dot(syn1.T) * (l1 * (1-l1))
    # Weight updates.
    syn1 += l1.T.dot(l2_delta)
    syn0 += X.T.dot(l1_delta)
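The loop prints nothing; an illustrative check after training (not part of the original gist) shows how close the network gets:

print("Mean absolute error:", np.mean(np.abs(y - l2)))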