@daerduoCarey
daerduoCarey / binvox_rw.py
Created November 13, 2017 23:30 — forked from chrischoy/binvox_rw.py
ShapeNet Voxelization
# Copyright (C) 2012 Daniel Maturana
# This file is part of binvox-rw-py.
#
# binvox-rw-py is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# binvox-rw-py is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of

NIPS Notes

Tutorials

Variational Inference

  • Simple intro by Blei, mostly going over Jordan's review paper.
  • Later introduces SVI (Stochastic Variational Inference) as a remedy that makes VI tractable on large datasets.
  • Reviews black-box variational inference (assumption-free VI): http://www.jmlr.org/proceedings/papers/v33/ranganath14.pdf
  • Key idea is to rewrite the gradient of the ELBO as an expectation under q and estimate it by sampling. Computing that expectation in closed form normally requires an exponential-family assumption; swapping the gradient and the expectation removes it, and the overall stochastic method still converges because the samples give unbiased gradient estimates satisfying the Robbins-Monro conditions. However, the variance of these estimates is very large and requires even further tricks (see the sketch after this list).
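
As a concrete illustration of the noisy-gradient idea, here is a minimal sketch of black-box VI using the score-function (log-derivative) gradient estimator with Robbins-Monro step sizes. The toy conjugate-Gaussian model, sample counts, and step-size schedule are illustrative choices and not taken from the tutorial or the paper; the paper's actual variance-reduction tricks (Rao-Blackwellization and control variates) are omitted, so the plain estimator shown here is deliberately noisy.

import numpy as np

# Toy conjugate model (illustrative, not from the tutorial): latent mean z with
# prior z ~ N(0, 10^2) and observations x_i ~ N(z, 1^2), so the exact posterior
# is available for a sanity check.
rng = np.random.default_rng(0)
x = rng.normal(3.0, 1.0, size=50)
prior_var, lik_var = 10.0 ** 2, 1.0 ** 2

def log_joint(z):
    # log p(x, z) up to an additive constant (constants do not bias the estimator)
    log_prior = -0.5 * z ** 2 / prior_var
    log_lik = -0.5 * np.sum((x[None, :] - z[:, None]) ** 2, axis=1) / lik_var
    return log_prior + log_lik

def log_q(z, mu, log_sigma):
    # log q(z; mu, sigma) for a Gaussian variational family, up to a constant
    return -0.5 * (z - mu) ** 2 / np.exp(2 * log_sigma) - log_sigma

# Black-box VI: grad ELBO = E_q[ grad_lambda log q(z) * (log p(x, z) - log q(z)) ],
# estimated with Monte Carlo samples from q (score-function / log-derivative trick).
mu, log_sigma = 0.0, 0.0
S = 200  # samples per iteration; generous because the plain estimator is high-variance
for t in range(1, 2001):
    sigma = np.exp(log_sigma)
    z = rng.normal(mu, sigma, size=S)                      # z_s ~ q_lambda
    w = log_joint(z) - log_q(z, mu, log_sigma)             # learning signal
    g_mu = np.mean((z - mu) / sigma ** 2 * w)              # score of q wrt mu, times w
    g_ls = np.mean(((z - mu) ** 2 / sigma ** 2 - 1) * w)   # score wrt log sigma, times w
    rho = 0.1 / (t + 10) ** 0.7                            # Robbins-Monro step size
    mu, log_sigma = mu + rho * g_mu, log_sigma + rho * g_ls

# Exact conjugate posterior for comparison.
post_var = 1.0 / (1.0 / prior_var + len(x) / lik_var)
post_mu = post_var * np.sum(x) / lik_var
print(f"BBVI : mu={mu:.3f}, sigma={np.exp(log_sigma):.3f}")
print(f"exact: mu={post_mu:.3f}, sigma={np.sqrt(post_var):.3f}")

The BBVI estimates land near the exact posterior, but the per-step gradient noise is visible unless S is large, which is exactly the variance problem the further tricks are meant to address.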