Kind of. The two implementations are mathematically equivalent, but this trick makes inference easier. GPflow does this under the hood for its HMC routine; they have a nice example of it here.
Implementing the GP directly looks like this:

K = covariance matrix
f ~ N(0, K)
y ~ Poisson(lambda = exp(f))

(the exp is a log link, so the Poisson rate stays positive). Or it can be reparameterized like this:

L = cholesky(K)
v ~ N(0, I)
f = L v
y ~ Poisson(lambda = exp(f))
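The equivalence is easy to check numerically. Here is a quick NumPy sketch (the squared-exponential kernel, lengthscale, and input grid are made-up illustrative values, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example: squared-exponential kernel on a small input grid
x = np.linspace(0, 1, 5)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3 ** 2)
K += 1e-8 * np.eye(5)  # jitter for numerical stability

L = np.linalg.cholesky(K)

# K = L L^T, so both parameterizations define the same prior over f
assert np.allclose(K, L @ L.T)

# Empirically: f = L v with v ~ N(0, I) has covariance approximately K
v = rng.standard_normal((5, 200_000))
f = L @ v
print(np.max(np.abs(np.cov(f) - K)))  # small; shrinks as the sample count grows
```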
It's the same model mathematically, since K = LL^T. It's like the procedure for drawing samples from a multivariate normal N(mu, K) using the Cholesky decomposition, which is:
L = cholesky(K)
v ~ N(0, I)
sample = mu + L v
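That sampling recipe looks like this in NumPy (a sketch; the mean vector and covariance values below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative mean and positive-definite covariance
mu = np.array([1.0, -2.0, 0.5])
K = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])

L = np.linalg.cholesky(K)
v = rng.standard_normal((3, 100_000))

# sample = mu + L v gives draws from N(mu, K)
samples = mu[:, None] + L @ v
print(samples.mean(axis=1))  # close to mu
print(np.cov(samples))       # close to K
```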
This trick makes inference easier because you are estimating v rather than f. v has far fewer tricky correlations than f, since its prior is just a standard normal. And you need the Cholesky factor L anyway, so computing it costs nothing extra.
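One way to see why v is the easier target (a NumPy sketch with made-up kernel values, not something from the thread): when the inputs are close together, the prior covariance of f is badly conditioned, which is exactly the kind of geometry that slows samplers down, while the prior covariance of v is the identity.

```python
import numpy as np

# Made-up squared-exponential kernel on closely spaced inputs
x = np.linspace(0, 1, 30)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.5 ** 2) + 1e-8 * np.eye(30)

# f ~ N(0, K): strongly correlated, nearly singular prior geometry
print(np.linalg.cond(K))           # very large condition number

# v ~ N(0, I): perfectly conditioned; f is recovered as L @ v
print(np.linalg.cond(np.eye(30)))  # exactly 1
```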
Thanks for the explanation! It would be great if this were the default in PyMC3, since sampling with MvNormal generally seems a bit slow.
Thanks a lot @bwengals!
So basically, instead of using a `pm.MvNormal` here, you used a standard normal with the covariance function to represent the GP?