@darkseed
darkseed / schools.stan
Created August 9, 2017 15:02 — forked from strongh/schools.stan
toy example of MCMC using (py)stan and (py)spark
data {
  int<lower=0> J;          // number of schools
  real y[J];               // estimated treatment effects
  real<lower=0> sigma[J];  // s.e. of effect estimates
}
parameters {
  real mu;
  real<lower=0> tau;
  real eta[J];
}
// the preview cuts off here; the blocks below assume the standard non-centered eight-schools completion
transformed parameters {
  real theta[J];
  for (j in 1:J)
    theta[j] = mu + tau * eta[j];
}
model {
  eta ~ normal(0, 1);
  y ~ normal(theta, sigma);
}
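The (py)stan/(py)spark driver itself is not part of this preview. As a rough sketch of how a model like this is typically compiled and sampled from Python with PyStan 2.x (the eight-schools data values below are illustrative, not taken from the gist):

import pystan

# classic eight-schools data, used here only as an illustration
schools_data = {
    "J": 8,
    "y": [28, 8, -3, 7, -1, 1, 18, 12],
    "sigma": [15, 10, 16, 11, 9, 11, 10, 18],
}

model = pystan.StanModel(file="schools.stan")                  # compile the model above
fit = model.sampling(data=schools_data, iter=1000, chains=4)   # run 4 MCMC chains
print(fit)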
@darkseed
darkseed / python-pil-image-sprite.py
Created April 22, 2017 19:49 — forked from gourneau/python-pil-image-sprite.py
Make sprites of images using Python and PIL
#!/usr/bin/python
# This work is licensed under the Creative Commons Attribution 3.0 United
# States License. To view a copy of this license, visit
# http://creativecommons.org/licenses/by/3.0/us/ or send a letter to Creative
# Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.
# from http://oranlooney.com/make-css-sprites-python-image-library/
# Original Author: Oran Looney <[email protected]>
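Only the license header of the forked script survives in this preview. Below is a minimal Pillow sketch of the general idea, resizing equally sized tiles and pasting them side by side into one sheet; the function name, tile size and glob pattern are illustrative, not the gist's own.

import glob
from PIL import Image

def make_sprite(paths, out_path="sprite.png", tile=(64, 64)):
    """Resize each image to `tile` and paste them left-to-right into one sheet."""
    images = [Image.open(p).resize(tile) for p in paths]
    sheet = Image.new("RGBA", (tile[0] * len(images), tile[1]))
    for i, im in enumerate(images):
        sheet.paste(im, (i * tile[0], 0))   # offset each tile by its column index
    sheet.save(out_path)

if __name__ == "__main__":
    make_sprite(sorted(glob.glob("icons/*.png")))

A stylesheet can then address each icon in the sheet with a background-position offset that is a multiple of the tile width.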
#!/bin/bash
ROS_DISTRO=${ROS_DISTRO:-kinetic}   # default to ROS Kinetic unless already set
# bootstrap Homebrew, then install CMake
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
brew update
brew install cmake
package com.databricks.spark.jira
import scala.io.Source
import org.apache.spark.rdd.RDD
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.sources.{TableScan, BaseRelation, RelationProvider}
@darkseed
darkseed / gist:12f58e684768529b72d7d89f0440ea5e
Created February 7, 2017 09:16 — forked from marmbrus/gist:15e72f7bc22337cf6653
Parallel list files on S3 with Spark
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.conf.Configuration

case class S3File(path: String, isDir: Boolean, size: Long) {
  def children = listFiles(path)
}

def listFiles(path: String): Seq[S3File] = {
  val fs = FileSystem.get(new java.net.URI(path), new Configuration())
  fs.listStatus(new Path(path)).map(s => S3File(s.getPath.toString, s.isDir, s.getLen))
}
// many prefixes can then be expanded in parallel, e.g. sc.parallelize(paths).flatMap(listFiles _)
@darkseed
darkseed / a3c.py
Created January 24, 2017 10:23 — forked from awjuliani/a3c.py
import tensorflow as tf
import tensorflow.contrib.slim as slim   # imports assumed; not shown in the preview

class AC_Network():
    def __init__(self, s_size, a_size, scope, trainer):
        with tf.variable_scope(scope):
            # Input and visual encoding layers
            self.inputs = tf.placeholder(shape=[None, s_size], dtype=tf.float32)
            self.imageIn = tf.reshape(self.inputs, shape=[-1, 84, 84, 1])
            self.conv1 = slim.conv2d(activation_fn=tf.nn.elu,
                                     inputs=self.imageIn, num_outputs=16,
                                     kernel_size=[8, 8], stride=[4, 4], padding='VALID')
            self.conv2 = slim.conv2d(activation_fn=tf.nn.elu,
                                     inputs=self.conv1, num_outputs=32,   # preview truncates mid-call;
                                     kernel_size=[4, 4], stride=[2, 2], padding='VALID')  # 32/4x4/stride-2 is assumed

Advanced Functional Programming with Scala - Notes

Copyright © 2017 Fantasyland Institute of Learning. All rights reserved.

1. Mastering Functions

A function is a mapping from one set, called a domain, to another set, called the codomain. A function associates every element in the domain with exactly one element in the codomain. In Scala, both domain and codomain are types.

val square: Int => Int = x => x * x
def sigmoid(x: Double) = 1.0 / (1.0 + math.exp(-x))
def f1measure(TP: Double, TN: Double, FP: Double, FN: Double, alpha: Double = 1) = {
  // harmonic mean of precision and recall (TN and alpha are not used in this simplified form)
  val P = precision(TP, FP)
  val R = recall(TP, FN)
  (2.0 * P * R) / (P + R)
}
def precision(TP: Double, FP: Double) = TP / (FP + TP)
def recall(TP: Double, FN: Double) = TP / (TP + FN)   // referenced above but missing from the notes as shown
@darkseed
darkseed / spark_knn_approximation.py
Created November 11, 2016 13:20 — forked from tomron/spark_knn_approximation.py
A naive approximation of the k-nn (k-nearest neighbors) algorithm in pyspark. Approximation quality can be controlled by the number of repartitions and the number of repartition rounds.
from __future__ import print_function
import sys
from math import sqrt
import argparse
from collections import defaultdict
from random import randint
from pyspark import SparkContext
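The preview stops at the imports. Below is a rough sketch of the repartition-based approximation the description points at, not tomron's implementation; every name and parameter here is illustrative. Points are shuffled into random partitions, exact k-nn is solved inside each partition, and the closest neighbors seen across a few rounds are kept.

from math import sqrt
from pyspark import SparkContext

def euclidean(a, b):
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def local_knn(points, k):
    # exact k-nn restricted to the points that landed in this partition
    points = list(points)
    for i, p in enumerate(points):
        dists = sorted((euclidean(p, q), j) for j, q in enumerate(points) if j != i)
        yield (tuple(p), [points[j] for _, j in dists[:k]])

def approximate_knn(sc, data, k, num_partitions=8, rounds=3):
    best = {}
    rdd = sc.parallelize(data)
    for seed in range(rounds):
        # a different random key each round gives a different partitioning
        shuffled = (rdd.map(lambda p: (hash((tuple(p), seed)), p))
                       .partitionBy(num_partitions)
                       .values())
        for p, neigh in shuffled.mapPartitions(lambda part: local_knn(part, k)).collect():
            # keep the k closest neighbors seen so far (duplicates across rounds are tolerated here)
            merged = best.get(p, []) + neigh
            best[p] = sorted(merged, key=lambda q: euclidean(p, q))[:k]
    return best

if __name__ == "__main__":
    sc = SparkContext(appName="knn-approximation-sketch")
    data = [[float(i), float(i % 7)] for i in range(1000)]   # toy 2-D points
    neighbors = approximate_knn(sc, data, k=5)

Accuracy improves with more rounds and with larger partitions, both at extra cost, which is the trade-off the gist description alludes to.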