"""
Forward the design matrix through the network layers using the parameters.
"""
function forward_propagate_model_weights(DMatrix, parameters)
master_cache = []
A = DMatrix
L = Int(length(parameters) / 2)
# Forward propagate until the last (output) layer
for l = 1 : (L-1)
A_prev = A
A, cache = linear_forward_activation(A_prev,
parameters[string("W_", (l))],
parameters[string("b_", (l))],
"relu")
push!(master_cache , cache)
end
# Make predictions in the output layer
Ŷ, cache = linear_forward_activation(A,
parameters[string("W_", (L))],
parameters[string("b_", (L))],
"sigmoid")
push!(master_cache, cache)
return Ŷ, master_cache
end
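
A minimal usage sketch follows. The `linear_forward_activation` defined here is a simplified stand-in written for illustration (an assumption, not the gist's own helper, which lives in a companion snippet), and the 4-3-1 network parameters and design matrix are made up for the example.

# --- usage sketch (assumptions: stand-in linear_forward_activation, toy parameters) ---
using Random

relu(Z) = max.(Z, 0)
sigmoid(Z) = 1 ./ (1 .+ exp.(-Z))

# Simplified stand-in: computes Z = W * A_prev .+ b, applies the named activation,
# and caches the inputs that backpropagation would later need.
function linear_forward_activation(A_prev, W, b, activation)
    Z = W * A_prev .+ b
    A = activation == "relu" ? relu(Z) : sigmoid(Z)
    return A, (A_prev, W, b, Z)
end

Random.seed!(42)

# Hypothetical parameters for a 4-3-1 network: features as rows, examples as columns.
parameters = Dict(
    "W_1" => 0.01 .* randn(3, 4), "b_1" => zeros(3, 1),
    "W_2" => 0.01 .* randn(1, 3), "b_2" => zeros(1, 1),
)

DMatrix = randn(4, 5)   # 4 features × 5 examples

Ŷ, caches = forward_propagate_model_weights(DMatrix, parameters)
println(size(Ŷ))        # (1, 5): one sigmoid output per example

The cache pushed for each layer is what the backward pass consumes, so `master_cache` ends up with one entry per layer in forward order.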