Jensen-Shannon Divergence in Python
import numpy as np
from scipy.stats import entropy

def jsd(p, q, base=np.e):
    '''
    Implementation of pairwise Jensen-Shannon divergence based on
    https://en.wikipedia.org/wiki/Jensen%E2%80%93Shannon_divergence
    '''
    ## convert inputs to np.array
    p, q = np.asarray(p), np.asarray(q)
    ## normalize p, q to probabilities
    p, q = p/p.sum(), q/q.sum()
    ## mixture distribution m = (p + q) / 2
    m = (p + q)/2.
    ## JSD is the average of the two KL divergences to the mixture;
    ## scipy's entropy(p, m) computes KL(p || m)
    return entropy(p, m, base=base)/2. + entropy(q, m, base=base)/2.
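
A quick usage sketch (the example vectors below are made up for illustration): since jsd normalizes its inputs, raw counts work just as well as probability vectors.

p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]
print(jsd(p, q))          # natural-log base (nats)
print(jsd(p, q, base=2))  # base 2 (bits); bounded in [0, 1]
print(jsd(p, p, base=2))  # identical distributions -> 0.0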
For those who land here looking for the Jensen-Shannon distance (estimated via Monte Carlo integration) between two continuous distributions, see:
https://stats.stackexchange.com/questions/345915/trying-to-implement-the-jensen-shannon-divergence-for-multivariate-gaussians/419421#419421
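
A minimal sketch of that Monte Carlo idea, assuming two scipy.stats.multivariate_normal distributions (the js_distance_mc helper, means, and covariances here are illustrative, not the linked answer's exact code): draw samples from p and q, estimate KL(p||m) and KL(q||m) against the mixture density m = (p + q)/2, average them, and take the square root to get the distance.

import numpy as np
from scipy.stats import multivariate_normal

def js_distance_mc(p_dist, q_dist, n_samples=10**5, seed=None):
    ## Monte Carlo estimate of the Jensen-Shannon distance between two
    ## continuous distributions exposing scipy-style .rvs()/.pdf()/.logpdf()
    rng = np.random.default_rng(seed)
    x_p = p_dist.rvs(size=n_samples, random_state=rng)
    x_q = q_dist.rvs(size=n_samples, random_state=rng)
    def log_ratio(dist, x):
        ## log(dist(x) / m(x)) with mixture density m = (p + q) / 2
        m_x = (p_dist.pdf(x) + q_dist.pdf(x)) / 2.
        return dist.logpdf(x) - np.log(m_x)
    ## JSD = 0.5 * E_p[log(p/m)] + 0.5 * E_q[log(q/m)]
    jsd_val = (log_ratio(p_dist, x_p).mean() + log_ratio(q_dist, x_q).mean()) / 2.
    ## clip tiny negative MC noise before the square root
    return np.sqrt(max(jsd_val, 0.))  # distance = sqrt(divergence)

## made-up example: two 2-D Gaussians with shifted means
p = multivariate_normal(mean=[0., 0.], cov=np.eye(2))
q = multivariate_normal(mean=[1., 1.], cov=np.eye(2))
print(js_distance_mc(p, q, seed=0))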