-- Walk the tree: at each Branch, compare the selected feature against the
-- split value and descend left or right until a Leaf is reached.
predict' :: DecisionTree -> V.Vector Double -> Double
predict' (Leaf value) _ = value
predict' (Branch feature splitValue left right) featureVector =
    if featureVector V.! feature < splitValue
        then predict' left featureVector
        else predict' right featureVector
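For readers more comfortable in Python, the same traversal can be sketched with a tuple-based tree. The tuple encoding below is an assumption made for illustration; it is not the Haskell `DecisionTree` type above.

```python
# Illustrative Python analogue of the Haskell predict' above.
# A tree is either ("leaf", value) or ("branch", feature, split, left, right)
# -- this tuple encoding is assumed here purely for the sketch.
def predict(tree, features):
    if tree[0] == "leaf":
        return tree[1]
    _, feature, split, left, right = tree
    if features[feature] < split:
        return predict(left, features)
    return predict(right, features)
```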
=================================================================================================
                  Compiled         Naive            Piecewise Flattened   Contiguous Flattened
-------------------------------------------------------------------------------------------------
(Intercept)     -11560.208***     -8697.547***     -11962.803***         -18324.924***
                 (1436.562)        (1105.108)       (2250.144)            (3568.027)
num_trees           54.382***        44.813***         44.427***             64.475***
                    (6.300)          (4.846)           (9.868)              (15.647)
depth             2479.344***      1975.124***       2441.172***           3651.876***
                  (233.106)        (179.322)         (365.123)             (578.972)
num_features         0.058***         0.074***          0.166***              0.268***
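To make the table concrete: each column is a linear model predicting evaluation time (`time_ns`) from the benchmark parameters. A small sketch using the point estimates from the Compiled column (the example inputs are arbitrary, chosen only to illustrate reading the coefficients):

```python
# Point estimates from the "Compiled" column of the table above.
coef = {
    "intercept": -11560.208,
    "num_trees": 54.382,
    "depth": 2479.344,
    "num_features": 0.058,
}

def predicted_time_ns(num_trees, depth, num_features):
    # Linear-model prediction: intercept plus coefficient times regressor.
    return (coef["intercept"]
            + coef["num_trees"] * num_trees
            + coef["depth"] * depth
            + coef["num_features"] * num_features)
```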
library("nlme")
library("data.table")
library("memisc")
library("functional")

kColumns <- c("algorithm", "num_trees", "depth", "num_features", "time_ns")

preprocess <- function(filename) {
  # Could use read.table, but it fails on this input for some reason
  df <- data.table(read.csv(filename, col.names = kColumns))
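A rough Python equivalent of the preprocessing step. The column names follow `kColumns` above; the headerless-CSV input format is an assumption based on the `read.csv(..., col.names=...)` call:

```python
import csv

K_COLUMNS = ["algorithm", "num_trees", "depth", "num_features", "time_ns"]

def preprocess(filename):
    # Read each benchmark row into a dict keyed by the column names above.
    with open(filename, newline="") as f:
        return [dict(zip(K_COLUMNS, row)) for row in csv.reader(f)]
```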
\documentclass[12pt]{amsart}
\usepackage{amsthm, amsmath, amssymb}
\usepackage{setspace}
\usepackage{listings}
\onehalfspacing
\theoremstyle{plain}% default
\newtheorem{thm}{Theorem}[section]
\newtheorem{lem}[thm]{Lemma}
\newtheorem{prop}[thm]{Proposition}
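With the theorem environments declared above, a numbered theorem is typeset as, for example:

```latex
% Illustrative usage of the thm environment from the preamble above.
\begin{thm}[Euclid]
There are infinitely many primes.
\end{thm}
```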
library('ProjectTemplate')
load.project()
## Linear Regression
mod <- lm(Y ~ ., data = zip.train.filtered)
# Threshold each continuous prediction into one of the two digit classes (2 or 3)
category_f <- function(x) { if (x > 2.5) 3 else 2 }
predictions.lm.test <- as.character(sapply(predict(mod, zip.test.filtered),
                                           category_f))
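The thresholding step can also be sketched in Python. The 2.5 cut-off and the two digit classes come from the R `category_f` above; everything else here is illustrative:

```python
# Round a continuous regression prediction to one of the two digit
# classes (2 or 3), using the same 2.5 threshold as the R category_f.
def category_f(x):
    return "3" if x > 2.5 else "2"

predictions = [category_f(p) for p in [1.9, 2.4, 2.6, 3.3]]
```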

Simple Examples

This section introduces the LaTeX2Markdown tool, with examples of the various environments available.

Theorem 1 (Euclid, 300 BC)

There are infinitely many primes.
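Euclid's argument can be checked computationally for any finite list of primes: the product of the list plus one is divisible by none of them, so its smallest prime factor is a prime outside the list. A small Python sketch:

```python
# Given primes p1..pk, N = p1*...*pk + 1 leaves remainder 1 on division by
# each pi, so N's smallest prime factor is a prime not in the list.
def smallest_prime_factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

primes = [2, 3, 5, 7, 11, 13]
product_plus_one = 1
for p in primes:
    product_plus_one *= p
product_plus_one += 1  # 2*3*5*7*11*13 + 1 = 30031
new_prime = smallest_prime_factor(product_plus_one)
```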

GHCi, version 7.6.3: http://www.haskell.org/ghc/ :? for help
Loading package ghc-prim ... linking ... done.
Loading package integer-gmp ... linking ... done.
Loading package base ... linking ... done.
Prelude> :cd /Users/tulloch/Code/haskell/machinelearning/
Prelude> :load "MachineLearning/HopfieldDemonstration.hs"
[2 of 3] Compiling MachineLearning.Hopfield ( MachineLearning/Hopfield.hs, interpreted )
[3 of 3] Compiling MachineLearning.HopfieldDemonstration ( MachineLearning/HopfieldDemonstration.hs, interpreted )
Ok, modules loaded: MachineLearning.HopfieldDemonstration, MachineLearning.Hopfield, MachineLearning.Util.
*MachineLearning.HopfieldDemonstration> main
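The demonstration module itself is not reproduced here, but a Hopfield network's core operations fit in a few lines of Python. This is a generic sketch of the standard Hebbian-training / threshold-recall scheme, not the `MachineLearning.Hopfield` implementation loaded above:

```python
import numpy as np

def train(patterns):
    # Hebbian rule: W = (1/n) * sum of outer products of +/-1 patterns,
    # with the diagonal zeroed out.
    p = np.array(patterns, dtype=float)
    n = p.shape[1]
    w = p.T @ p / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, steps=5):
    # Synchronous updates: threshold each unit's local field at zero.
    s = np.array(state, dtype=float)
    for _ in range(steps):
        s = np.where(w @ s >= 0.0, 1.0, -1.0)
    return s
```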
import numpy as np
from collections import namedtuple
import logging

logging.basicConfig(level=logging.DEBUG)


class BarrierFunction(object):
    def __init__(self, n):
        # n: problem dimension; the barrier construction is elided here
        pass
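The `__init__` above is a stub. As a hedged sketch of what such a class typically computes, here is the standard logarithmic barrier for linear inequality constraints `A x <= b` (a generic sketch, not the original implementation):

```python
import numpy as np

def log_barrier(A, b, x):
    # Barrier value: -sum(log(b - A x)); +inf outside the feasible region.
    slack = np.asarray(b, dtype=float) - np.asarray(A, dtype=float) @ np.asarray(x, dtype=float)
    if np.any(slack <= 0.0):
        return np.inf
    return -float(np.sum(np.log(slack)))
```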
import numpy as np
import cvxopt


class Kernel(object):
    """Implements a list of kernels from
    http://en.wikipedia.org/wiki/Support_vector_machine
    """
    @staticmethod
    def linear(x, y):
        # The standard linear kernel: the inner product of the two inputs.
        return np.inner(x, y)
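The listing is cut off at the linear kernel. In the same style, another kernel from the Wikipedia list can be sketched as follows (a hypothetical addition for illustration, not part of the original class):

```python
import numpy as np

def gaussian(sigma):
    # Gaussian (RBF) kernel: k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    def k(x, y):
        d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
        return float(np.exp(-np.inner(d, d) / (2.0 * sigma ** 2)))
    return k
```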