Helper function to initialize weights and biases
import numpy as np


def initialize_parameters(n_in, n_out, ini_type='plain'):
    """
    Helper function to initialize random weights and zero biases.

    Args:
        n_in: size of the input layer
        n_out: size of the output layer / number of neurons
        ini_type: initialization scheme for the weights ('plain', 'xavier', or 'he')

    Returns:
        params: a dictionary containing the weight matrix W and the bias vector b
    """
    params = dict()  # dictionary of neural net parameters W and b

    if ini_type == 'plain':
        params['W'] = np.random.randn(n_out, n_in) * 0.01  # set weights 'W' to small random Gaussian values
    elif ini_type == 'xavier':
        params['W'] = np.random.randn(n_out, n_in) / np.sqrt(n_in)  # set variance of W to 1/n_in
    elif ini_type == 'he':
        # Good when ReLU is used in the hidden layers
        # "Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification"
        # Kaiming He et al. (https://arxiv.org/abs/1502.01852)
        # http://cs231n.github.io/neural-networks-2/#init
        params['W'] = np.random.randn(n_out, n_in) * np.sqrt(2 / n_in)  # set variance of W to 2/n_in
    else:
        raise ValueError("ini_type must be 'plain', 'xavier', or 'he'")

    params['b'] = np.zeros((n_out, 1))  # set biases 'b' to zeros

    return params
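
A minimal usage sketch (the seed and layer sizes here are arbitrary, chosen only for illustration): it builds He-initialized parameters for a layer with 4 inputs and 3 neurons, checks the returned shapes, and then verifies the variance claim on a larger layer, where the empirical standard deviation of W should be close to sqrt(2/n_in).

# Example usage (a minimal sketch; seed and sizes are arbitrary assumptions)
np.random.seed(48)  # arbitrary seed, for reproducibility only

params = initialize_parameters(n_in=4, n_out=3, ini_type='he')
print(params['W'].shape)  # (3, 4) -> one row of weights per neuron
print(params['b'].shape)  # (3, 1) -> one bias per neuron

# Sanity check on the variance: for 'he', std(W) should approach sqrt(2/n_in)
big = initialize_parameters(n_in=1000, n_out=1000, ini_type='he')
print(big['W'].std())    # empirically close to ...
print(np.sqrt(2 / 1000))  # ... the target sqrt(2/n_in) ≈ 0.0447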