Gaurav Menghani reddragon

🤖
Too much to do, too little time.
View GitHub Profile
reddragon / struct.cpp
Last active September 12, 2017 18:30
Set struct members inline
#include <iostream>
using namespace std;

struct Foo {
  int a;
  double b;
};

int main() {
  // Aggregate initialization sets the struct members inline.
  // The original gist is truncated here; the initializer values
  // below are illustrative.
  const Foo f = {1, 2.5};
  cout << f.a << " " << f.b << endl;
  return 0;
}
reddragon / transfer-learning.md
Created April 2, 2018 00:42
Transfer Learning Papers

How transferable are features in deep neural networks? - Yosinski et al.

  • Transfer Learning

    • Train a base network, then take its learned weights and adapt them to a new target task.
    • Notes from CS231N.

  • Tries to quantify how much information can be transferred between networks trained on different datasets.

  • Quantifies transferability layer by layer.

  • Hypothesis: the first few layers are general (Gabor-filter-like features) and adapt well to new tasks.
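The layer-by-layer transfer setup above can be sketched as copying the first n layers of a trained base network into a fresh target network (the paper then either freezes or fine-tunes the copied layers). A minimal sketch; the shapes and helper names are illustrative, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(sizes):
    """Random weight matrices for a simple feed-forward net."""
    return [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]

def transfer(base_net, n_copy, sizes):
    """Build a target net reusing the first n_copy layers of base_net.

    In the paper's experiments the copied layers are either frozen
    (not updated) or fine-tuned along with the new layers.
    """
    target = init_net(sizes)
    for i in range(n_copy):
        target[i] = base_net[i].copy()  # transferred layers
    return target

sizes = [784, 300, 60, 10]
base = init_net(sizes)
target = transfer(base, n_copy=2, sizes=sizes)

# First two layers match the base network; the last one is freshly initialized.
assert all(np.array_equal(base[i], target[i]) for i in range(2))
assert not np.array_equal(base[2], target[2])
```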

Results on MNIST

Feed-forward model with two hidden layers (300 and 60 units).

| l2_lambda | Accuracy@1 after 80k iters (two runs) |
|-----------|---------------------------------------|
| 0.00      | 98.15, 98.04                          |
| 0.01      | 98.31, 98.19                          |
| 0.02      | 98.19, 98.15                          |
| 0.04      | 97.93, 97.92                          |
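The l2_lambda sweep above varies the strength of L2 regularization (weight decay). A minimal sketch of the corresponding gradient update, assuming a loss of the form data_loss + (l2_lambda/2)·w²; the function name and constants are illustrative:

```python
def sgd_step(w, grad, lr=0.1, l2_lambda=0.01):
    """One SGD step with an L2 penalty.

    For loss = data_loss + (l2_lambda / 2) * w**2, the penalty adds
    l2_lambda * w to the data gradient, shrinking weights toward zero.
    """
    return w - lr * (grad + l2_lambda * w)

# With a zero data gradient, the penalty alone decays the weight:
# w_new = 1.0 - 0.1 * (0.0 + 0.04 * 1.0) = 0.996
print(sgd_step(1.0, grad=0.0, lr=0.1, l2_lambda=0.04))  # 0.996
```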