Andrew Schreiber (andrewschreiber)
@andrewschreiber
andrewschreiber / vg_logic.py
Created August 16, 2019 06:29
def save_vanilla_gradient(network, data, labels), see https://github.com/andrewschreiber/numpy-saliency
import numpy as np

def save_vanilla_gradient(network, data, labels):
    # Create a saliency map for each data point
    for i, image in enumerate(data):
        # Forward pass on the image
        # Note: each layer caches its activations during the forward pass
        output = image
        for l in range(len(network.layers)):
            output = network.layers[l].forward(output)

        # Backprop to get the gradient of the score w.r.t. the input
        label_one_hot = labels[i]
        dy = np.array(label_one_hot)
        for l in range(len(network.layers) - 1, -1, -1):
            dy = network.layers[l].backward(dy)
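The preview ends at the backward loop; at that point dy holds the gradient of the class score with respect to the input image. A minimal sketch of the saving step that gives the gist its name follows — the helper name and the matplotlib usage are assumptions, not taken from the repo:

# Sketch only: turns the input gradient into a saliency image and writes it
# to disk. save_saliency_map is a hypothetical helper, not from the repo.
import numpy as np
import matplotlib.pyplot as plt

def save_saliency_map(input_gradient, index):
    saliency = np.abs(input_gradient)      # sensitivity magnitude per pixel
    if saliency.ndim == 3:                 # multi-channel input: reduce over channels
        saliency = saliency.max(axis=0)
    plt.imsave(f"saliency_{index}.png", saliency, cmap="hot")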
@andrewschreiber
andrewschreiber / vg_p1.py
Last active August 16, 2019 06:15
def save_vanilla_gradient(network, data, labels), see https://github.com/andrewschreiber/numpy-saliency
# Create a saliency map for each data point
for i, image in enumerate(data):
    # Run a forward pass with an image
    output = image
    for l in range(len(network.layers)):
        output = network.layers[l].forward(output)
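Both loops assume each entry in network.layers exposes a forward method that caches its activations and a backward method that returns the gradient with respect to its input. A minimal sketch of that contract — the class and attribute names here are assumptions, not taken from numpy-saliency:

import numpy as np

# Sketch of the layer interface the loops above rely on; names are assumed.
class Dense:
    def __init__(self, in_dim, out_dim):
        self.W = 0.01 * np.random.randn(in_dim, out_dim)
        self.b = np.zeros(out_dim)

    def forward(self, x):
        self.x = x                 # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, dy):
        # Return dL/dx so the loop keeps propagating toward the image.
        return dy @ self.W.T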
from model.data import mnist_train_test_sets
from model.network import LeNet5
from saliency.vanilla_gradient import save_vanilla_gradient

# Get the MNIST dataset, preprocessed
train_images, train_labels, test_images, test_labels = mnist_train_test_sets()

# Load the net with weights trained to 98% accuracy
net = LeNet5(weights_path="15epoch_weights.pkl")

# Generate saliency maps for the first 10 images
save_vanilla_gradient(net, test_images[:10], test_labels[:10])
@andrewschreiber
andrewschreiber / road_sign_classifier.py
Last active August 19, 2017 11:43
Example of an expanded Hyperdash SDK
# From the CLI:
#   hyperdash run -n 'mymodel' python mymodel.py
import hyperdash as hd

learning_rate = hd.param('learning rate', default=0.01)  # Set up hyperparameters

# Model code here

hd.metric('loss', training_loss)  # Record a metric
# Params and metrics are pretty-printed at the end of the experiment
@andrewschreiber
andrewschreiber / jupyter_gym_render.md
Last active December 29, 2021 12:02
How to stream OpenAI Gym environment rendering within a Jupyter Notebook

Open jupyter with

$ xvfb-run -s "-screen 0 1400x900x24" jupyter notebook

In Jupyter

import matplotlib.pyplot as plt
%matplotlib inline

After each step

def show_state(env, step=0):
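The preview cuts off at the signature. A sketch of a completed body, assuming a classic Gym environment whose render accepts mode='rgb_array' and redrawing in place with IPython.display:

import matplotlib.pyplot as plt
from IPython import display

def show_state(env, step=0):
    plt.figure(3)
    plt.clf()
    plt.imshow(env.render(mode='rgb_array'))   # grab the current frame
    plt.title("Step: %d" % step)
    plt.axis('off')
    display.clear_output(wait=True)            # replace the previous frame
    display.display(plt.gcf())

Calling show_state(env, step=t) after each env.step(...) inside the episode loop streams the frames into the notebook output cell.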

Keybase proof

I hereby claim:

  • I am andrewschreiber on github.
  • I am andrewschreiber (https://keybase.io/andrewschreiber) on keybase.
  • I have a public key whose fingerprint is B124 EC8F E431 5EC0 7D0C 6205 2824 08D7 8326 AB72

To claim this, I am signing this object:

@andrewschreiber
andrewschreiber / mac_gym_installer.sh
Created April 12, 2017 00:04
Installs OpenAI Gym on macOS
#!/bin/sh
# See video: https://www.youtube.com/watch?v=7PO27i2lEOs
set -e

# Returns success if the given command is available on the PATH.
# (POSIX-safe redirection; the bash-only `&>` breaks under /bin/sh.)
command_exists () {
    type "$1" > /dev/null 2>&1
}
@andrewschreiber
andrewschreiber / ReactiveCocoa.podspec.json
Created October 7, 2016 23:37
RAC 4.2.2 podspec for Swift 2.3 on Xcode 8.0
{
  "name": "ReactiveCocoa",
  "version": "4.2.2",
  "summary": "A framework for composing and transforming streams of values.",
  "description": "ReactiveCocoa (RAC) is an Objective-C framework for Functional Reactive Programming.\nIt provides APIs for composing and transforming streams of values.",
  "homepage": "https://github.com/ReactiveCocoa/ReactiveCocoa",
  "license": {
    "type": "MIT",
    "file": "LICENSE.md"
  },