This article is now published on my website: Prefer Subshells for Context.
#!/usr/bin/env python
"""
The testkernel command does these things to make Linux kernel testing easy:
  - looks in the indicated folder on the web
  - downloads the .deb files for the kernel of the given type
  - installs them locally
  - configures grub2 to reboot to the given kernel on the next reboot

TODO:
"""
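The install and grub2 steps above amount to something like the sketch below; the function name and arguments are illustrative placeholders, not taken from the actual testkernel script.

# Illustrative sketch only; the real testkernel script may differ.
import glob
import subprocess

def install_and_stage(deb_dir, grub_entry):
    """Install the downloaded kernel .debs and boot into them once on the next reboot."""
    debs = sorted(glob.glob(deb_dir + "/*.deb"))
    subprocess.check_call(["sudo", "dpkg", "-i"] + debs)
    # grub-reboot changes the default entry for the next boot only, so a bad
    # kernel does not leave the machine permanently unbootable.
    subprocess.check_call(["sudo", "grub-reboot", grub_entry])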
This is pretty much what I had intended to work on. Specifically, I'd like to get the Docker image that packages Spark on top of the ipython/scipyserver image (https://github.com/rdhyee/ipython-spark/blob/master/Dockerfile) to run on a Mesos cluster. Of relevance is a Spark PR that seems almost ready to go: apache/spark#3074 (combined with https://issues.apache.org/jira/browse/SPARK-2691). If we get the basics working, I'd love to work on integrating this with https://github.com/rgbkrk/cloudpipe, which I understand to be a hybrid of http://www.multyvac.com/ (I loved its predecessor, PiCloud) and tmpnb (https://lambdaops.com/ipythonjupyter-tmpnb-debuts/). It would be wonderful to let people spin up a temporary Jupyter notebook that can run Spark and is based on an arbitrary Docker image.
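As a rough sketch of what running such an image against Mesos could look like from inside the notebook, PySpark can be pointed at a Mesos master via SparkConf; the master URL, app name, and spark.executor.uri value below are placeholders, not settings taken from the repos or tickets above.

# Hypothetical configuration for PySpark on Mesos; all concrete values are placeholders.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("notebook-on-mesos")
        .setMaster("mesos://10.0.0.1:5050")  # address of the Mesos master
        .set("spark.executor.uri",
             "http://example.com/spark-1.2.0-bin-hadoop2.4.tgz"))  # Spark build the executors fetch

sc = SparkContext(conf=conf)
print(sc.parallelize(range(1000)).sum())  # quick smoke test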
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# Modified.
# Original script source:
# http://blog.marcbelmont.com/2012/10/script-to-extract-email-attachments.html
# https://web.archive.org/web/20150312172727/http://blog.marcbelmont.com/2012/10/script-to-extract-email-attachments.html
# Usage:
# Run the script from a folder with file "all.mbox"
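For reference, the core of an mbox attachment extractor can be written with Python's standard-library mailbox and email APIs; this is a minimal sketch under that assumption, not the original script's exact code.

# Minimal sketch: walk every message in all.mbox and save its named attachments.
import mailbox
import os

os.makedirs("attachments", exist_ok=True)
for msg in mailbox.mbox("all.mbox"):
    for part in msg.walk():
        filename = part.get_filename()
        if not filename:
            continue  # skip message bodies and other unnamed parts
        payload = part.get_payload(decode=True)  # undoes base64 / quoted-printable encoding
        if payload:
            with open(os.path.join("attachments", os.path.basename(filename)), "wb") as out:
                out.write(payload)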
""" Trains an agent with (stochastic) Policy Gradients on Pong. Uses OpenAI Gym. """ | |
import numpy as np | |
import cPickle as pickle | |
import gym | |
# hyperparameters | |
H = 200 # number of hidden layer neurons | |
batch_size = 10 # every how many episodes to do a param update? | |
learning_rate = 1e-4 | |
gamma = 0.99 # discount factor for reward |
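For context on how gamma is used in this kind of script, per-step rewards are usually folded into discounted returns as sketched below (with the Pong-specific reset at game boundaries); this is a generic sketch of the technique, not necessarily this file's exact implementation.

def discount_rewards(r):
    """Turn a 1D array of per-step rewards into discounted returns using gamma above."""
    discounted = np.zeros_like(r, dtype=np.float64)
    running_add = 0.0
    for t in reversed(range(r.size)):
        if r[t] != 0:
            running_add = 0.0  # reset at a game boundary (Pong rewards are nonzero only when a point ends)
        running_add = running_add * gamma + r[t]
        discounted[t] = running_add
    return discounted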