https://github.com/vkaynig/IACS_ComputeFest_DeepLearning - IACS ComputeFest deep learning repo
https://news.ycombinator.com/item?id=10901980 - HN thread
https://www.quora.com/What-is-regularization-in-machine-learning - what is regularization
https://www.quora.com/Why-does-deep-learning-architectures-use-only-non-linear-activation-function-in-the-hidden-layers - why hidden layers need non-linear activations
http://www.cs.toronto.edu/~fritz/absps/reluICML.pdf - rectified linear units (ReLU) paper
https://github.com/twitter/torch-autograd - automatic differentiation for Torch
http://www.kdnuggets.com/2015/12/deep-learning-outgrows-bag-words-recurrent-neural-networks.html - RNNs outgrowing bag-of-words for NLP
http://www.nvidia.com/object/jetson-tx1-dev-kit.html - Jetson TX1 dev kit
http://petewarden.com/2015/05/23/why-are-eight-bits-enough-for-deep-neural-networks/ - 8-bit precision is enough for NN inference
http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow/ - CNN for text classification in TensorFlow
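A minimal sketch of the point behind the ReLU paper and the Quora question above: without a non-linearity, stacked linear layers collapse into a single linear map, and ReLU is just max(0, x) applied elementwise.

```python
import numpy as np

def relu(x):
    # ReLU: elementwise max(0, x). Placing this between linear layers is
    # what keeps a deep stack from collapsing into one linear transform.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 3.0])
print(relu(x))  # negatives clipped to zero, positives passed through
```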
http://arxiv.org/abs/1312.6199 - "Intriguing properties of neural networks": tiny adversarial perturbations (near-imperceptible noise) make NNs misclassify
http://arxiv.org/abs/1512.03385 - deep residual networks (ResNet): a 34-layer net that does insanely well, with experiments past 1000 layers
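The trick in the ResNet paper (arXiv:1512.03385) is the identity shortcut: each block learns a residual F(x) and adds the input back, so very deep stacks only have to nudge the identity map. A toy NumPy sketch (weight shapes and names are my own, not from the paper):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    # Residual learning: compute F(x) = W2 @ relu(W1 @ x), then add the
    # shortcut x back before the final non-linearity.
    return relu(W2 @ relu(W1 @ x) + x)

d = 4
rng = np.random.default_rng(0)
x = rng.normal(size=d)

# With zero weights the block degenerates to relu(x): the shortcut alone
# carries the signal, which is why gradients survive hundreds of layers.
out = residual_block(x, np.zeros((d, d)), np.zeros((d, d)))
```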
https://www.youtube.com/watch?v=S75EdAcXHKk
http://deeplearning4j.org/lstm.html - LSTMs and RNNs
http://neuralnetworksanddeeplearning.com/chap1.html - Nielsen's book, chapter 1
https://medium.com/learning-new-stuff/how-to-learn-neural-networks-758b78f2736e#.1o7ytolrq - how to learn neural networks
http://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/ - anyone can code an LSTM
http://www.r2d3.us/visual-intro-to-machine-learning-part-1/ - visual intro to machine learning
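The recurrence that the deeplearning4j and iamtrask LSTM posts walk through fits in a few lines: input/forget/output gates plus a tanh candidate update the cell state, and the hidden state is a gated read of it. A sketch with all four gates packed into one weight matrix (this packing layout is my own convention):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    # One LSTM time step. W has shape (4*H, D+H): rows are the input (i),
    # forget (f), output (o) gates and the candidate (g), stacked in order.
    z = W @ np.concatenate([x, h_prev]) + b
    H = h_prev.size
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c = f * c_prev + i * g      # cell state: forget old memory, admit new
    h = o * np.tanh(c)          # hidden state: gated view of the cell
    return h, c

H, D = 3, 2
rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(4 * H, D + H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):   # run a short sequence through the cell
    h, c = lstm_step(x, h, c, W, b)
```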
https://www.reddit.com/r/programming/comments/412kqz/a_critique_of_how_to_c_in_2016/