Dustin Tran (dustinvtran)
dustinvtran / main.py
Created December 14, 2020 08:02
Python version of the source code for https://mcspeedrun.com/dream.pdf. This implementation uses default precision (float64), so the decimal values differ slightly from the original Java implementation, which uses BigDecimal.
import numpy as np
import scipy.stats
from typing import List


def shifty_investigator(num_trials: int,
                        num_successes: int,
                        p_success: float):
  p_fail = 1. - p_success
  # Probability of at least num_successes successes in num_trials. The
  # preview cuts off mid-call; the remaining arguments are filled in
  # here so the snippet runs.
  target_p = 1. - scipy.stats.binom.cdf(k=num_successes - 1,
                                        n=num_trials,
                                        p=p_success)
  # ... (gist preview truncated)
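Not part of the gist: a hypothetical call using figures commonly cited in discussions of dream.pdf (262 pearl barters, 42 successes, per-barter probability 20/423); treat these numbers as assumptions.

# All figures below are assumptions taken from public discussion of
# dream.pdf, not from the gist itself.
shifty_investigator(num_trials=262, num_successes=42, p_success=20. / 423.)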
from contextlib import contextmanager

TRACE_STACK = [lambda f, *args, **kwargs: f(*args, **kwargs)]


@contextmanager
def trace(tracer):
  TRACE_STACK.append(tracer)
  try:
    yield
  finally:
    TRACE_STACK.pop()  # pop even if the traced block raises


def traceable(func):
  def func_wrapped(*args, **kwargs):
    return TRACE_STACK[-1](func, *args, **kwargs)  # completed from truncated preview
  return func_wrapped
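Not in the preview: a minimal usage sketch of the tracing pattern above, with a hypothetical tracer that logs each traceable call.

@traceable
def add(x, y):
  return x + y

def logging_tracer(f, *args, **kwargs):
  print('calling', f.__name__, args)  # hypothetical logging tracer
  return f(*args, **kwargs)

with trace(logging_tracer):
  add(1, 2)  # prints: calling add (1, 2)
add(1, 2)  # outside the context, the default tracer just calls the function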
dustinvtran / toposort.py
Created February 4, 2018 22:34
Building off autograd's reverse toposort.
import operator


def toposort(start_node, children=operator.attrgetter('children')):
  """Generate nodes in the DAG's topological order. This guarantees
  that for any edge U -> V, we always visit U before visiting V.
  This lets us play the tape via the "push" dataflow model.
  https://stackoverflow.com/questions/981027/what-are-pro-cons-of-push-pull-data-flow-models
  """
  parent_counts = {}
  # ... (gist preview truncated; a completion sketch follows)
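The preview ends at parent_counts. As a sketch only, assuming the gist follows autograd's two-pass pattern (not the gist's actual continuation): first count each node's parents, then repeatedly yield nodes whose parents have all been emitted.

import operator

def toposort_sketch(start_node, children=operator.attrgetter('children')):
  # First pass: count each reachable node's parents. A node's count is
  # incremented every time it is reached; we only recurse into its
  # children on the first visit.
  parent_counts = {}
  stack = [start_node]
  while stack:
    node = stack.pop()
    if node in parent_counts:
      parent_counts[node] += 1
    else:
      parent_counts[node] = 1
      stack.extend(children(node))
  # Second pass ("push" dataflow): emit a node only after all of its
  # parents have been emitted.
  ready = [start_node]
  while ready:
    node = ready.pop()
    yield node
    for child in children(node):
      if parent_counts[child] == 1:
        ready.append(child)
      else:
        parent_counts[child] -= 1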
dustinvtran / namespace_hack.py
Last active December 27, 2017 10:24
Context managers as namespaces. From "Don't Do This" by Richard Jones (https://www.youtube.com/watch?v=H2yfXnUb1S4)
import inspect


class LocalsCapture(object):
  def __enter__(self):
    caller_frame = inspect.currentframe().f_back
    self.local_names = set(caller_frame.f_locals)
    return self

  def __exit__(self, exc_type, exc_val, exc_tb):
    caller_frame = inspect.currentframe().f_back
    # Completed from the truncated preview, following the pattern in the
    # talk: hand any name created inside the with-block to capture().
    for name, value in caller_frame.f_locals.items():
      if name not in self.local_names:
        self.capture(name, value)
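The gist presumably continues by subclassing LocalsCapture, as in the talk; a sketch under that assumption, with a usage example:

class namespace(LocalsCapture):
  def capture(self, name, value):
    setattr(self, name, value)  # attach each captured local to the object

with namespace() as ns:
  x = 1
  greeting = 'hello'

print(ns.x, ns.greeting)  # 1 hello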
dustinvtran / PPC_Plots.ipynb
Created July 1, 2017 23:21 — forked from lbollar/PPC_Plots.ipynb
Test Notebook to demo plot behavior
dustinvtran / truncated_dp_mixture.py
Created February 18, 2017 02:49
In-progress implementation of a truncated DP (Dirichlet process) mixture model.
#!/usr/bin/env python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import edward as ed
import matplotlib.cm as cm
import numpy as np
import seaborn as sns
import tensorflow as tf
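The preview shows only imports. For context, a minimal numpy sketch of the truncated stick-breaking construction such a model typically rests on (an assumption about the gist's content, not taken from it):

import numpy as np

def stick_breaking_weights(alpha, truncation, rng=np.random.default_rng(0)):
  # Draw Beta(1, alpha) stick fractions and convert them to mixture
  # weights pi_k = beta_k * prod_{j<k} (1 - beta_j). Setting the last
  # fraction to 1 absorbs the leftover stick, so the truncated weights
  # sum to one.
  betas = rng.beta(1., alpha, size=truncation)
  betas[-1] = 1.
  remaining = np.concatenate([[1.], np.cumprod(1. - betas[:-1])])
  return betas * remaining

pi = stick_breaking_weights(alpha=1.0, truncation=10)
assert np.isclose(pi.sum(), 1.0)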
dustinvtran / concrete.py
Last active January 17, 2017 00:35
A template for implementing custom random variables in Edward.
from edward.models import RandomVariable
from tensorflow.contrib.distributions import Distribution


class Concrete(RandomVariable, Distribution):
  def __init__(self, *args, **kwargs):
    kwargs['is_continuous'] = True
    kwargs['is_reparameterized'] = True
    super(Concrete, self).__init__(*args, **kwargs)

  def _log_prob(self, value):
    # Preview truncated here; the full template would return the
    # Concrete density's log-probability at `value`.
    raise NotImplementedError()
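Not part of the gist: a sketch of the other method such a template typically overrides, _sample_n, written against TF 1.x-era APIs and assuming hypothetical logits and temperature attributes:

import tensorflow as tf

class ConcreteSketch(Concrete):
  def _sample_n(self, n, seed=None):
    # Gumbel-softmax / Concrete sampling:
    # softmax((logits + Gumbel noise) / temperature).
    shape = tf.concat([[n], tf.shape(self.logits)], 0)
    uniform = tf.random_uniform(shape, minval=1e-8, maxval=1., seed=seed)
    gumbel = -tf.log(-tf.log(uniform))
    return tf.nn.softmax((self.logits + gumbel) / self.temperature)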
dustinvtran / tensorflow_api.csv
Created August 13, 2016 00:25
List of all TensorFlow operations
# TensorFlow r0.10
#
# Building Graph
#
add_to_collection,tf.add_to_collection
as_dtype,tf.as_dtype
bytes,tf.bytes
container,tf.container
control_dependencies,tf.control_dependencies
convert_to_tensor,tf.convert_to_tensor
dustinvtran / gist.py
Last active March 14, 2016 01:56
Assignment (pass by reference) in TensorFlow
# This note shows that TensorFlow objects are held by reference:
# no deep copy is made when one class contains another.
#
# This is relevant for variational parameters. For example, Inference
# holds Variational, and we train the variational parameters via
# inference.run(). Afterward the original variational object holds the
# trained parameters, so we can use it directly instead of going
# through inference.variational.
# (This is standard Python reference semantics, not TensorFlow-specific.)
from __future__ import print_function
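To make the claim concrete, a pure-Python sketch (the class names are hypothetical stand-ins mirroring the comment above):

class Variational(object):
  def __init__(self):
    self.params = {'loc': 0.0}

class Inference(object):
  def __init__(self, variational):
    self.variational = variational  # stores a reference, not a copy

  def run(self):
    self.variational.params['loc'] = 1.5  # stand-in for training

variational = Variational()
inference = Inference(variational)
inference.run()
print(variational.params['loc'])  # 1.5: the original object sees the update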
dustinvtran / lspi_notes.md
Last active August 29, 2015 14:16
Outline for CS282r lecture (03/12/15): LSPI algorithm

Summary Portion

  1. Minjae: Brief overview of the paper's contributions, summary of the decomposition of the Q-function into k basis functions, and a quick derivation of the weights.
  2. Dustin: LSQ, its asymptotic behavior, the incremental update, and LSPI.

Some things to reinforce:

  • That it's linear with respect to the parameters and not necessarily the data.
  • That it calculates the parameters by least squares (in other words, a decision-theoretic criterion) rather than by MLE or MAP estimation; see the sketch after this list.
  • That its primary advantages are intuition, simplicity, and good theoretical guarantees.
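Not from the outline: a minimal numpy sketch of the LSTDQ weight solve at the core of LSPI, assuming feature matrices phi for sampled (s, a) pairs and phi_next for (s', pi(s')); all names are illustrative.

import numpy as np

def lstdq(phi, phi_next, rewards, gamma=0.99):
  # Linear Q-function Q(s, a) = phi(s, a) . w. The least-squares fixed
  # point solves A w = b with A = Phi^T (Phi - gamma * Phi') and
  # b = Phi^T r.
  A = phi.T @ (phi - gamma * phi_next)
  b = phi.T @ rewards
  return np.linalg.solve(A, b)

# Illustrative shapes: 100 samples, k = 5 basis functions.
rng = np.random.default_rng(0)
phi = rng.normal(size=(100, 5))
phi_next = rng.normal(size=(100, 5))
rewards = rng.normal(size=100)
w = lstdq(phi, phi_next, rewards)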