@dpiponi
Created June 13, 2017 22:21
Here are some tanh units being self-normalising
import math
import numpy

lambda0 = 1.59254
n = 1000
nlayers = 100

# Incoming activations have mean 0, variance 1
x = numpy.random.normal(0, 1, n)

# Apply 100 fully connected random layers of 1000 units each
for i in range(nlayers):
    w = numpy.random.normal(0, 1.0/math.sqrt(n), (n, n))
    x = lambda0*numpy.tanh(w.dot(x))

# Mean and variance remain around 0, 1
print(numpy.mean(x), numpy.var(x))
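The scaling constant is not explained in the gist, but it appears to be chosen so that unit variance is a fixed point: if the incoming activations have unit variance, each pre-activation w.dot(x) is approximately standard normal, so taking lambda0 = 1/sqrt(E[tanh(Z)^2]) for Z ~ N(0, 1) restores unit variance after the nonlinearity. A quick Monte-Carlo sketch of that reading (my assumption, not stated by the author):

```python
import numpy

# Estimate E[tanh(Z)^2] for Z ~ N(0, 1) and invert its square root.
# If this reading is right, the result should match the 1.59254 above.
rng = numpy.random.default_rng(0)
z = rng.normal(0, 1, 10_000_000)
lambda0 = 1.0 / numpy.sqrt(numpy.mean(numpy.tanh(z) ** 2))
print(lambda0)  # close to 1.59254
```

The mean needs no correction because tanh is odd, so E[tanh(Z)] = 0 stays put; only the variance has to be rescaled.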