I hereby claim:
- I am Dawny33 on github.
- I am dawny33 (https://keybase.io/dawny33) on keybase.
- I have a public key whose fingerprint is DD01 24DB 0983 7AAC C6FE 1427 0A5B F308 6A57 CBBA
To claim this, I am signing this object:
This gist is a simple explanation of how to connect to and access your Jupyter notebooks on a P2 instance from your local browser.

I had some problems directly accessing port 8888 of my server for Jupyter. However, the following steps helped sort that out.

The following steps assume that you already have a P2 instance running with the `DeepLearning Ubuntu` AMI.

- Jupyter is already installed on your server if you are running the DeepLearning AMI.
- Start Jupyter with `jupyter notebook`.
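When port 8888 is not reachable from the internet, a common workaround is an SSH tunnel from your local machine to the instance. A minimal sketch, assuming the same placeholder key file and hostname style used elsewhere in this gist (substitute your own):

```shell
# Forward local port 8888 to port 8888 on the P2 instance.
# YOUR_PEM_FILE.pem and the hostname below are placeholders.
ssh -i YOUR_PEM_FILE.pem -N -L 8888:localhost:8888 ubuntu@ec2-1-1-1-1.us-west-2.compute.amazonaws.com
```

With the tunnel open, point your local browser at `http://localhost:8888` to reach the notebook.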
```python
# Implementation of a simple MLP network with one hidden layer. Tested on the iris data set.
# Requires: numpy, sklearn>=0.18.1, tensorflow>=1.0
# NOTE: In order to make the code simple, we rewrite x * W_1 + b_1 as x' * W_1',
# where x' = [x | 1] and W_1' is the matrix W_1 appended with a new row with elements b_1's.
# Similarly, for h * W_2 + b_2.
import tensorflow as tf
import numpy as np
from sklearn import datasets
from sklearn.model_selection import train_test_split
```
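The bias-folding trick described in the comments above can be checked numerically with NumPy. The values of `x`, `W_1`, and `b_1` below are small made-up numbers for illustration, not taken from the gist:

```python
import numpy as np

# Verify that x * W_1 + b_1 == x' * W_1',
# where x' = [x | 1] and W_1' is W_1 with b_1 appended as an extra row.
x = np.array([[1.0, 2.0]])            # one sample with two features
W_1 = np.array([[1.0, 0.5],
                [0.25, 2.0]])
b_1 = np.array([3.0, -1.0])

x_aug = np.hstack([x, np.ones((x.shape[0], 1))])  # x' = [x | 1]
W_aug = np.vstack([W_1, b_1])                     # W_1' = [W_1; b_1]

assert np.allclose(x @ W_1 + b_1, x_aug @ W_aug)
```

Folding the bias into the weight matrix this way lets a single matrix multiplication express the whole affine layer.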
```python
import paramiko

# Connect to an EC2 instance with a key pair and run a command over SSH.
k = paramiko.RSAKey.from_private_key_file("YOUR_PEM_FILE.pem")
c = paramiko.SSHClient()
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
c.connect(hostname="ec2-1-1-1-1.us-west-2.compute.amazonaws.com", username="ec2-user", pkey=k)
stdin, stdout, stderr = c.exec_command("hostname")
# .read() returns bytes on Python 3, so decode before concatenating.
print("stdout: " + stdout.read().decode())
print("stderr: " + stderr.read().decode())
c.close()
```
```python
# -*- coding: utf-8 -*-
def load_dataset():
    "Load the sample dataset."
    return [[1, 3, 4], [2, 3, 5], [1, 2, 3, 5], [2, 5]]

def createC1(dataset):
    "Create a list of candidate item sets of size one."
    # Frozensets are hashable, so they can later serve as dictionary keys
    # when counting support in the Apriori algorithm.
    items = sorted({item for transaction in dataset for item in transaction})
    return [frozenset([item]) for item in items]
```
A list of bite-sized projects:

- A to-do app
- Automated ML pipeline with Luigi/Airflow
- A fully-automatic client-server framework (using deep neural nets) to identify lost domestic animals based only on images/videos uploaded by the owner and a short video uploaded by the finder
- A service which takes string data and returns a capitalized version of that data
- Convert natural language to SQL queries
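The capitalization-service idea above can be sketched in a few lines with Python's standard library. This is a hedged illustration, not a finished design: the names `capitalize_text` and `CapitalizeHandler` are invented here, "capitalized" is taken to mean upper-casing, and the port is arbitrary:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def capitalize_text(data: str) -> str:
    """Return the capitalized (upper-cased) version of the input string."""
    return data.upper()

class CapitalizeHandler(BaseHTTPRequestHandler):
    """Accept a POST body and echo it back capitalized."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(capitalize_text(body).encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CapitalizeHandler).serve_forever()
```

Running the module and sending `curl -X POST --data "hello" localhost:8000` would exercise the handler end to end.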