@llSourcell
Created July 10, 2017 13:48
#first we feed the input image into a convolutional layer to get a feature map
h = self.cnn_layer(X, layer_i=0, border_mode="full") ; X = h
#then we activate it with a nonlinearity so the network can model non-linear functions (turn all negative numbers to 0)
h = self.relu_layer(X) ; X = h
#another conv layer to extract smaller, more abstract feature maps (more hierarchical features = better prediction)
h = self.cnn_layer(X, layer_i=2, border_mode="valid") ; X = h
#another nonlinearity
h = self.relu_layer(X) ; X = h
#pooling to reduce computational cost, extract the most important features
h = self.maxpooling_layer(X) ; X = h
#dropout to prevent overfitting: randomly turn nodes off during training, forcing the network to learn multiple
#pathways for the data
h = self.dropout_layer(X, .25) ; X = h
#flatten the feature maps into a 1D vector so they can feed into dense layers
h = self.flatten_layer(X) ; X = h
#fully connected layer to combine all the learned features
h = self.dense_layer(X, layer_i=7) ; X = h
#nonlinearity
h = self.relu_layer(X) ; X = h
#more dropout
h = self.dropout_layer(X, .5) ; X = h
#another fully connected layer, mapping down to one output per class
h = self.dense_layer(X, layer_i=10) ; X = h
#softmax to turn the outputs into a probability distribution over classes
h = self.softmax_layer2D(X) ; X = h
#make the classification based on the probabilities
max_i = self.classify(X)
#return predicted label
return max_i[0]
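The `self.*` helpers above aren't shown in this gist, but the post-convolution steps they describe (ReLU, max pooling, softmax, argmax classification) can be sketched in plain NumPy. The function names below are hypothetical stand-ins, not the gist's actual implementation:

```python
import numpy as np

def relu(x):
    # zero out negative activations
    return np.maximum(x, 0)

def maxpool2d(x, size=2):
    # non-overlapping size x size max pooling over an (H, W) feature map
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]  # trim so dimensions divide evenly
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(logits):
    # numerically stable softmax: subtract the max before exponentiating
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify(probs):
    # index of the highest-probability class
    return int(np.argmax(probs))

fmap = np.array([[-1., 2., 0., 3.],
                 [ 4., -5., 6., 1.],
                 [ 0., 1., -2., 2.],
                 [ 3., 0., 1., -4.]])
pooled = maxpool2d(relu(fmap))           # 2x2 map of local maxima
probs = softmax(np.array([1.0, 2.0, 0.5]))
label = classify(probs)                  # -> 1
```

Each pooled cell keeps only the strongest activation in its window, which is what makes pooling cheap dimensionality reduction, and softmax guarantees the outputs sum to 1 so `classify` can read them as class probabilities.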