Created September 9, 2017
Approximate Entropy with Python/Numpy: https://en.wikipedia.org/wiki/Approximate_entropy
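For reference, this is the standard definition from the linked Wikipedia article, which the code below implements directly: every length-m window x(i) = [U(i), ..., U(i+m-1)] of the series is compared with every other window under the maximum (Chebyshev) distance d, and ApEn is the difference of the resulting log-count averages for window lengths m and m+1:

\[
C_i^m(r) = \frac{\#\{\, j \le N-m+1 : d[x(i), x(j)] \le r \,\}}{N-m+1},
\qquad
\Phi^m(r) = \frac{1}{N-m+1} \sum_{i=1}^{N-m+1} \log C_i^m(r),
\]
\[
\mathrm{ApEn}(m, r, N) = \Phi^m(r) - \Phi^{m+1}(r).
\]
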
import numpy as np


def ApEn(U, m, r):
    """Approximate entropy of the series U for embedding dimension m and tolerance r."""

    def _maxdist(x_i, x_j):
        # Chebyshev (maximum) distance between two template vectors
        return max([abs(ua - va) for ua, va in zip(x_i, x_j)])

    def _phi(m):
        x = [[U[j] for j in range(i, i + m)] for i in range(N - m + 1)]
        C = [len([1 for x_j in x if _maxdist(x_i, x_j) <= r]) / (N - m + 1.0) for x_i in x]
        return (N - m + 1.0) ** (-1) * sum(np.log(C))

    N = len(U)
    return abs(_phi(m + 1) - _phi(m))


# Usage example
U = np.array([85, 80, 89] * 17)
print(ApEn(U, 2, 3))
# 1.0996541105257052e-05 -- a perfectly repeating sequence is highly regular,
# so its approximate entropy is essentially zero
Thanks!
A more efficient implementation, using NumPy built-in functions and broadcasting:

def ApEn_new(U, m, r):
    U = np.array(U)
    N = U.shape[0]

    def _phi(m):
        z = N - m + 1.0
        # All length-m template vectors, shape (z, m)
        x = np.array([U[i:i + m] for i in range(int(z))])
        # Broadcasting (z, 1, m) against (z, m) yields the full (z, z) matrix
        # of pairwise Chebyshev distances in one pass
        X = x[:, np.newaxis]
        C = np.sum(np.absolute(x - X).max(axis=2) <= r, axis=0) / z
        return np.log(C).sum() / z

    return abs(_phi(m + 1) - _phi(m))
Comparing results with the original version (the calls below assume m = 2 and r = 3, as in the usage example above):

a = [85, 80, 89] * 17
print('ApEn(a):    ', ApEn(a, 2, 3))
print('ApEn_new(a):', ApEn_new(a, 2, 3))
# ApEn(a):     1.0996541105257052e-05
# ApEn_new(a): 1.099654110658932e-05

a = np.random.choice([85, 80, 89], size=17 * 3)
print('ApEn(a):    ', ApEn(a, 2, 3))
print('ApEn_new(a):', ApEn_new(a, 2, 3))
# ApEn(a):     0.969167624973212
# ApEn_new(a): 0.969167624973212

(The tiny discrepancy in the first case is presumably floating-point rounding: the two versions accumulate their sums in different orders.)
Comparing the timing against the original version (again assuming m = 2, r = 3, and a setup string that imports both functions):

import timeit

setup = "from __main__ import ApEn, ApEn_new"
t = timeit.Timer(stmt="ApEn([85, 80, 89] * 17, 2, 3)", setup=setup)
print(t.timeit(20))  # 1.7361566790000325
t = timeit.Timer(stmt="ApEn_new([85, 80, 89] * 17, 2, 3)", setup=setup)
print(t.timeit(20))  # 0.12356296300004033
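
For larger inputs, the Python list comprehension that builds the template vectors inside _phi can also be avoided with a strided NumPy view. A minimal sketch along the same broadcasting lines, assuming NumPy >= 1.20 for numpy.lib.stride_tricks.sliding_window_view (ApEn_windows is a hypothetical name, not part of the gist):

import numpy as np
from numpy.lib.stride_tricks import sliding_window_view


def ApEn_windows(U, m, r):
    # Same definition as above, but the template vectors are built as a
    # strided view of U instead of via a Python list comprehension.
    U = np.asarray(U, dtype=float)
    N = U.shape[0]

    def _phi(m):
        z = N - m + 1
        x = sliding_window_view(U, m)  # shape (z, m), no copy
        # Pairwise Chebyshev distances, shape (z, z)
        d = np.abs(x[:, np.newaxis, :] - x[np.newaxis, :, :]).max(axis=2)
        C = np.sum(d <= r, axis=0) / z
        return np.log(C).sum() / z

    return abs(_phi(m + 1) - _phi(m))


# Should agree with ApEn_new on the examples above, e.g.:
# print(ApEn_windows([85, 80, 89] * 17, 2, 3))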
🔥
Thank you