import numpy as np

def linear_forward(W, A, b):
    """
    Implement the linear part of a layer's forward propagation.

    Arguments:
    A -- activations from previous layer (or input data): (size of previous layer, number of examples)
    W -- weights matrix: numpy array of shape (size of current layer, size of previous layer)
    b -- bias vector, numpy array of shape (size of current layer, 1)

    Returns:
    Z -- the input of the activation function, also called the pre-activation parameter
    cache -- a python tuple containing "W", "A" and "b"; stored for computing the backward pass efficiently
    """
    Z = W.dot(A) + b
    assert Z.shape == (W.shape[0], A.shape[1])
    cache = (W, A, b)
    return Z, cache
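As a quick sanity check (the layer sizes below are made up for illustration), `linear_forward` maps the activations of a 3-unit layer through a 4-unit layer, producing one pre-activation value per unit per example:

```python
import numpy as np

def linear_forward(W, A, b):
    # As above: Z = WA + b, with the inputs cached for backprop.
    Z = W.dot(A) + b
    cache = (W, A, b)
    return Z, cache

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # 3 units in the previous layer, 2 examples
W = rng.standard_normal((4, 3))   # current layer has 4 units
b = np.zeros((4, 1))              # bias broadcasts across the 2 examples

Z, cache = linear_forward(W, A, b)
print(Z.shape)  # (4, 2)
```

Note that `b` has shape (4, 1) but is added to a (4, 2) matrix: NumPy broadcasting applies the same bias to every example column.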
def sigmoid(Z):
    """
    Implement the sigmoid activation, element-wise.

    Arguments:
    Z -- pre-activation parameter, Z = WA + b

    Returns:
    A -- sigmoid(Z) = 1 / (1 + e^(-Z)), same shape as Z
    cache -- Z, stored for computing the backward pass efficiently
    """
    A = 1 / (1 + np.exp(-Z))
    assert A.shape == Z.shape
    cache = Z
    return A, cache
def relu(Z):
    """
    Implement the ReLU activation, element-wise.

    Arguments:
    Z -- pre-activation parameter, Z = WA + b

    Returns:
    A -- max(0, Z), same shape as Z
    cache -- Z, stored for computing the backward pass efficiently
    """
    A = np.maximum(0, Z)
    assert A.shape == Z.shape
    cache = Z
    return A, cache
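These three helpers compose into one full layer step: a linear transform followed by a nonlinearity. A minimal sketch of that composition (the helper name `linear_activation_forward` and the network sizes are illustrative, not part of the gist):

```python
import numpy as np

def linear_forward(W, A, b):
    Z = W.dot(A) + b
    return Z, (W, A, b)

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z)), Z

def relu(Z):
    return np.maximum(0, Z), Z

def linear_activation_forward(A_prev, W, b, activation):
    # Hypothetical helper: chain the linear step with a nonlinearity,
    # keeping both caches for the backward pass.
    Z, linear_cache = linear_forward(W, A_prev, b)
    if activation == "sigmoid":
        A, activation_cache = sigmoid(Z)
    else:
        A, activation_cache = relu(Z)
    return A, (linear_cache, activation_cache)

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))                        # 5 features, 3 examples
W1, b1 = rng.standard_normal((4, 5)), np.zeros((4, 1))  # hidden layer: 4 units
W2, b2 = rng.standard_normal((1, 4)), np.zeros((1, 1))  # output layer: 1 unit

A1, _ = linear_activation_forward(X, W1, b1, "relu")
A2, _ = linear_activation_forward(A1, W2, b2, "sigmoid")
print(A2.shape)  # (1, 3)
```

The ReLU hidden layer outputs are non-negative, and the sigmoid output lands strictly in (0, 1), so `A2` can be read as one probability per example.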