SLAC Keras Tutorial
{ | |
"cells": [ | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"# Introduction to Deep Learning with Keras\n", | |
"## Workshop on Machine Learning in $b$-tagging.\n", | |
"\n", | |
"#### Luke de Oliveira (5/23/17)\n", | |
"\n", | |
"##### Partner, Manifold / Visiting Researcher, Lawrence Berkeley National Lab" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" " | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## Outline for Today\n", | |
"* Navigating the zoo of deep learning libraries — why Keras?\n", | |
"* The right level of abstraction, FP\n", | |
"* Constructing models with the functional API\n", | |
"* Useful guidance for setting up an experiment" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" " | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" " | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## The Zoo of Deep Learning (Frameworks)\n", | |
"\n", | |
"* 95% of ML Twitter -- *“We’re releasing Y, our library we’ve used internally for Deep Learning at X?”*\n", | |
"* Y: TensorFlow, PaddlePaddle, CNTK, PyTorch, Torch, mxnet, Caffe2, Keras, prettytensor, tfslim, …\n", | |
"* X: Google, Deep Mind, Baidu, Microsoft, Facebook, Amazon,…\n", | |
"\n", | |
"It can be hard to know which of the many libraries to use. \n", | |
"\n", | |
"The major differentiator today is how a net is defined:\n", | |
"* config based (Caffe, Caffe2)\n", | |
"* declarative (TensorFlow, Theano, Caffe2)\n", | |
"* imperative (limited TensorFlow, PyTorch, MXNet, Chainer)" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" \n" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" " | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## Where does Keras fit in?\n", | |
"* Keras is a minimalist, high level, modular library that sits on top of lower level deep learning libraries\n", | |
"\n", | |
" * TensorFlow, Theano\n", | |
" * Soon-to-be mxnet, CNTK, PyTorch as well!\n", | |
"* Goal of Keras: **define the defacto API** for deep learning to cover 95% of use-cases\n", | |
" * Similar to the way numpy as shaped linear algebra software or ROOT has shaped HEP software\n", | |
" * Drive a common way of thinking about deep learning development\n", | |
"* Make 0-to-60 trivial, but allow for advanced researchers to extend" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" " | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
" " | |
] | |
}, | |
{ | |
"attachments": {}, | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## Reasoning About Deep Learning\n", | |
"\n", | |
"* **DO** think about nonlinear operators on matrices / tensors\n", | |
"* **DON’T** think about neurons or biological inspiration\n", | |
"* Better term for the field: differentiable networks\n", | |
"\n", | |
"In general, think about an operation in a deep network as an operator $T$ acting on a vector $x$, $x\\mapsto T(x)$.\n", | |
"\n", | |
"We want to build a **directed, acyclic graph** of these operations.\n", | |
"\n", | |
"![dag](https://i.stack.imgur.com/zuLmn.png![image.png](attachment:image.png)\n", | |
"\n", | |
"The only thing we have to keep in mind for Keras is that we *must* have an input defined for a graph. This is done using a special layer called an `Input`" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Let's import some data to get started! We'll use the Boston housing dataset because it is small and nicely interpretable." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 1, | |
"metadata": {}, | |
"outputs": [ | |
{ | |
"name": "stderr", | |
"output_type": "stream", | |
"text": [ | |
"Using TensorFlow backend.\n" | |
] | |
} | |
], | |
"source": [ | |
"from keras.datasets.boston_housing import load_data" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 2, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"(X_train, y_train), (X_test, y_test) = load_data()" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 3, | |
"metadata": {}, | |
"outputs": [ | |
{ | |
"data": { | |
"text/plain": [ | |
"(102, 13)" | |
] | |
}, | |
"execution_count": 3, | |
"metadata": {}, | |
"output_type": "execute_result" | |
} | |
], | |
"source": [ | |
"X_test.shape" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 4, | |
"metadata": {}, | |
"outputs": [ | |
{ | |
"data": { | |
"text/plain": [ | |
"(102,)" | |
] | |
}, | |
"execution_count": 4, | |
"metadata": {}, | |
"output_type": "execute_result" | |
} | |
], | |
"source": [ | |
"y_test.shape" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"# Building a DNN in Keras\n", | |
"\n", | |
"A supervised neural network, at it's core, consists of three components.\n", | |
"\n", | |
"1. A computation graph, i.e., the network itself with all of it's layers\n", | |
"1. A loss function to penalize a problem dependent notion of incorrectness\n", | |
" 1. mean squared error, mean absolute error, huber\n", | |
" 1. cross entropy, KL-divergence\n", | |
"1. An optimizer, or a way to learn the parameters of your model with respect to a loss\n", | |
" 1. Adam, Adagrad, RMSProp, Adadelta, Vanilla SGD, etc.\n", | |
" 1. Ability to anneal\n", | |
"\n", | |
"Let's start from the canonical version of a DNN, a simple multilayer perceptron-like architecture. Building this structure, i.e., a linear sequence of non-linear transforms, is very easy in Keras.\n", | |
"\n", | |
"Let's make a network that reads in the 13 inputs from the Boston Housing dataset, has two hidden layers, and one linear output. We'll think about layers as mathematical functions using the Keras **functional API**." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 5, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"from keras.layers import Dense, Input, Activation\n", | |
"from keras.models import Model\n", | |
"from keras.utils import plot_model" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 6, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"from matplotlib import pyplot as plt\n", | |
"%matplotlib inline" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Let's look at point #1 from above" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 7, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"# we define the input shape (i.e., how many input features) **without** the batch size\n", | |
"x = Input(shape=(13, ))\n", | |
"\n", | |
"# all Keras Ops look like z = f(z) (like functional programming)\n", | |
"h = Dense(20)(x)\n", | |
"h = Activation('relu')(h)\n", | |
"\n", | |
"h = Dense(20)(h)\n", | |
"h = Activation('relu')(h)\n", | |
"\n", | |
"# our output is a single number, the house price.\n", | |
"y = Dense(1)(h)\n", | |
"\n", | |
"# A model is a conta\n", | |
"net = Model(x, y)" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 8, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"plot_model(net, to_file='basic.png')" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"![basic](./basic.png)" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"To make this neural net useful, we need to **compile** it into a DAG (directed, acyclic graph). This is the **declarative** approach to deep learning. This compilation takes all of your ops, your loss, and your model, and constructs all of the operations to do gradient updates." | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Let's use the Adam [`[arXiv/1412.6980]`](https://arxiv.org/abs/1412.6980) optimizer with a mean squared error loss function, i.e., $\\Vert y - \\hat{y} \\Vert ^ 2$" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 9, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"net.compile(optimizer='adam', loss='mse')" | |
] | |
}, | |
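{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"As an aside: the string `'adam'` selects Adam with its default hyperparameters. If you want control over, e.g., the learning rate, you can pass an optimizer object instead. A sketch (the learning rate here is illustrative, not tuned):" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# equivalent compile call with an explicit optimizer object\n", | |
"from keras.optimizers import Adam\n", | |
"net.compile(optimizer=Adam(lr=1e-3), loss='mse')" | |
] | |
}, | |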
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Now, you're going to want to fit the model to your data. This is made very easy by the `.fit` member function of a model object. We might want to leave out a subset of our data, say 20%, for validation.\n", | |
"\n", | |
"In addition, we might want to save a network and impose an ability to stop when a validation loss stops decreasing. Keras introduces this with `callbacks`." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 10, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"from keras.callbacks import ModelCheckpoint, EarlyStopping" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 11, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"callbacks = [\n", | |
" # if we don't have a decrease of the loss for 10 epochs, terminate training.\n", | |
" EarlyStopping(verbose=True, patience=10, monitor='val_loss'), \n", | |
" # Always make sure that we're saving the model weights with the best val loss.\n", | |
" ModelCheckpoint('model.h5', monitor='val_loss', verbose=True, save_best_only=True)]" | |
] | |
}, | |
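{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Recall the \"ability to anneal\" from the optimizer checklist above -- Keras exposes this as a callback too. A sketch (not used in the run below): `ReduceLROnPlateau` shrinks the learning rate when the monitored metric stalls." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"from keras.callbacks import ReduceLROnPlateau\n", | |
"\n", | |
"# halve the learning rate if val_loss hasn't improved for 5 epochs;\n", | |
"# appending this to the callbacks list above would enable it\n", | |
"anneal = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=5, verbose=True)" | |
] | |
}, | |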
{ | |
"cell_type": "code", | |
"execution_count": 12, | |
"metadata": { | |
"scrolled": false | |
}, | |
"outputs": [ | |
{ | |
"name": "stdout", | |
"output_type": "stream", | |
"text": [ | |
"Train on 323 samples, validate on 81 samples\n", | |
"Epoch 1/60\n", | |
"Epoch 00000: val_loss improved from inf to 134.14750, saving model to model.h5\n", | |
"0s - loss: 468.0829 - val_loss: 134.1475\n", | |
"Epoch 2/60\n", | |
"Epoch 00001: val_loss improved from 134.14750 to 71.48350, saving model to model.h5\n", | |
"0s - loss: 100.2806 - val_loss: 71.4835\n", | |
"Epoch 3/60\n", | |
"Epoch 00002: val_loss improved from 71.48350 to 64.00830, saving model to model.h5\n", | |
"0s - loss: 65.9053 - val_loss: 64.0083\n", | |
"Epoch 4/60\n", | |
"Epoch 00003: val_loss improved from 64.00830 to 51.15335, saving model to model.h5\n", | |
"0s - loss: 52.1776 - val_loss: 51.1534\n", | |
"Epoch 5/60\n", | |
"Epoch 00004: val_loss did not improve\n", | |
"0s - loss: 44.8227 - val_loss: 52.3701\n", | |
"Epoch 6/60\n", | |
"Epoch 00005: val_loss improved from 51.15335 to 49.87214, saving model to model.h5\n", | |
"0s - loss: 41.4629 - val_loss: 49.8721\n", | |
"Epoch 7/60\n", | |
"Epoch 00006: val_loss improved from 49.87214 to 48.20905, saving model to model.h5\n", | |
"0s - loss: 39.9025 - val_loss: 48.2091\n", | |
"Epoch 8/60\n", | |
"Epoch 00007: val_loss did not improve\n", | |
"0s - loss: 38.2047 - val_loss: 48.7473\n", | |
"Epoch 9/60\n", | |
"Epoch 00008: val_loss did not improve\n", | |
"0s - loss: 37.4430 - val_loss: 48.2525\n", | |
"Epoch 10/60\n", | |
"Epoch 00009: val_loss improved from 48.20905 to 46.52702, saving model to model.h5\n", | |
"0s - loss: 36.1604 - val_loss: 46.5270\n", | |
"Epoch 11/60\n", | |
"Epoch 00010: val_loss improved from 46.52702 to 44.79909, saving model to model.h5\n", | |
"0s - loss: 35.0108 - val_loss: 44.7991\n", | |
"Epoch 12/60\n", | |
"Epoch 00011: val_loss improved from 44.79909 to 44.11574, saving model to model.h5\n", | |
"0s - loss: 34.4464 - val_loss: 44.1157\n", | |
"Epoch 13/60\n", | |
"Epoch 00012: val_loss did not improve\n", | |
"0s - loss: 33.2338 - val_loss: 45.2777\n", | |
"Epoch 14/60\n", | |
"Epoch 00013: val_loss improved from 44.11574 to 43.68342, saving model to model.h5\n", | |
"0s - loss: 32.4783 - val_loss: 43.6834\n", | |
"Epoch 15/60\n", | |
"Epoch 00014: val_loss improved from 43.68342 to 42.46121, saving model to model.h5\n", | |
"0s - loss: 32.4278 - val_loss: 42.4612\n", | |
"Epoch 16/60\n", | |
"Epoch 00015: val_loss did not improve\n", | |
"0s - loss: 30.9855 - val_loss: 42.6724\n", | |
"Epoch 17/60\n", | |
"Epoch 00016: val_loss improved from 42.46121 to 40.32434, saving model to model.h5\n", | |
"0s - loss: 31.4430 - val_loss: 40.3243\n", | |
"Epoch 18/60\n", | |
"Epoch 00017: val_loss did not improve\n", | |
"0s - loss: 29.6636 - val_loss: 40.7080\n", | |
"Epoch 19/60\n", | |
"Epoch 00018: val_loss improved from 40.32434 to 40.00365, saving model to model.h5\n", | |
"0s - loss: 31.8790 - val_loss: 40.0036\n", | |
"Epoch 20/60\n", | |
"Epoch 00019: val_loss improved from 40.00365 to 39.47815, saving model to model.h5\n", | |
"0s - loss: 30.5705 - val_loss: 39.4782\n", | |
"Epoch 21/60\n", | |
"Epoch 00020: val_loss improved from 39.47815 to 37.76539, saving model to model.h5\n", | |
"0s - loss: 29.7597 - val_loss: 37.7654\n", | |
"Epoch 22/60\n", | |
"Epoch 00021: val_loss did not improve\n", | |
"0s - loss: 29.2559 - val_loss: 41.1969\n", | |
"Epoch 23/60\n", | |
"Epoch 00022: val_loss did not improve\n", | |
"0s - loss: 30.9832 - val_loss: 38.2094\n", | |
"Epoch 24/60\n", | |
"Epoch 00023: val_loss did not improve\n", | |
"0s - loss: 31.8329 - val_loss: 40.6603\n", | |
"Epoch 25/60\n", | |
"Epoch 00024: val_loss did not improve\n", | |
"0s - loss: 28.6760 - val_loss: 39.3337\n", | |
"Epoch 26/60\n", | |
"Epoch 00025: val_loss did not improve\n", | |
"0s - loss: 28.1073 - val_loss: 37.9025\n", | |
"Epoch 27/60\n", | |
"Epoch 00026: val_loss did not improve\n", | |
"0s - loss: 28.5753 - val_loss: 37.8779\n", | |
"Epoch 28/60\n", | |
"Epoch 00027: val_loss improved from 37.76539 to 35.05966, saving model to model.h5\n", | |
"0s - loss: 26.7014 - val_loss: 35.0597\n", | |
"Epoch 29/60\n", | |
"Epoch 00028: val_loss improved from 35.05966 to 34.95771, saving model to model.h5\n", | |
"0s - loss: 26.1210 - val_loss: 34.9577\n", | |
"Epoch 30/60\n", | |
"Epoch 00029: val_loss did not improve\n", | |
"0s - loss: 24.9666 - val_loss: 36.4759\n", | |
"Epoch 31/60\n", | |
"Epoch 00030: val_loss improved from 34.95771 to 34.25068, saving model to model.h5\n", | |
"0s - loss: 24.3774 - val_loss: 34.2507\n", | |
"Epoch 32/60\n", | |
"Epoch 00031: val_loss did not improve\n", | |
"0s - loss: 23.7740 - val_loss: 34.3869\n", | |
"Epoch 33/60\n", | |
"Epoch 00032: val_loss did not improve\n", | |
"0s - loss: 23.7496 - val_loss: 34.2800\n", | |
"Epoch 34/60\n", | |
"Epoch 00033: val_loss improved from 34.25068 to 33.45103, saving model to model.h5\n", | |
"0s - loss: 23.4304 - val_loss: 33.4510\n", | |
"Epoch 35/60\n", | |
"Epoch 00034: val_loss did not improve\n", | |
"0s - loss: 22.8933 - val_loss: 33.7358\n", | |
"Epoch 36/60\n", | |
"Epoch 00035: val_loss improved from 33.45103 to 32.27490, saving model to model.h5\n", | |
"0s - loss: 22.6854 - val_loss: 32.2749\n", | |
"Epoch 37/60\n", | |
"Epoch 00036: val_loss improved from 32.27490 to 30.81804, saving model to model.h5\n", | |
"0s - loss: 23.2255 - val_loss: 30.8180\n", | |
"Epoch 38/60\n", | |
"Epoch 00037: val_loss did not improve\n", | |
"0s - loss: 22.4755 - val_loss: 33.0621\n", | |
"Epoch 39/60\n", | |
"Epoch 00038: val_loss did not improve\n", | |
"0s - loss: 21.8219 - val_loss: 30.9690\n", | |
"Epoch 40/60\n", | |
"Epoch 00039: val_loss did not improve\n", | |
"0s - loss: 21.8557 - val_loss: 31.3051\n", | |
"Epoch 41/60\n", | |
"Epoch 00040: val_loss did not improve\n", | |
"0s - loss: 24.0296 - val_loss: 31.4590\n", | |
"Epoch 42/60\n", | |
"Epoch 00041: val_loss improved from 30.81804 to 29.62165, saving model to model.h5\n", | |
"0s - loss: 21.0067 - val_loss: 29.6217\n", | |
"Epoch 43/60\n", | |
"Epoch 00042: val_loss improved from 29.62165 to 28.99235, saving model to model.h5\n", | |
"0s - loss: 20.3704 - val_loss: 28.9924\n", | |
"Epoch 44/60\n", | |
"Epoch 00043: val_loss did not improve\n", | |
"0s - loss: 20.3697 - val_loss: 30.1685\n", | |
"Epoch 45/60\n", | |
"Epoch 00044: val_loss improved from 28.99235 to 28.73855, saving model to model.h5\n", | |
"0s - loss: 20.4590 - val_loss: 28.7385\n", | |
"Epoch 46/60\n", | |
"Epoch 00045: val_loss improved from 28.73855 to 28.36430, saving model to model.h5\n", | |
"0s - loss: 19.2068 - val_loss: 28.3643\n", | |
"Epoch 47/60\n", | |
"Epoch 00046: val_loss improved from 28.36430 to 27.92112, saving model to model.h5\n", | |
"0s - loss: 19.6644 - val_loss: 27.9211\n", | |
"Epoch 48/60\n", | |
"Epoch 00047: val_loss did not improve\n", | |
"0s - loss: 18.4843 - val_loss: 27.9733\n", | |
"Epoch 49/60\n", | |
"Epoch 00048: val_loss improved from 27.92112 to 26.66096, saving model to model.h5\n", | |
"0s - loss: 18.7995 - val_loss: 26.6610\n", | |
"Epoch 50/60\n", | |
"Epoch 00049: val_loss improved from 26.66096 to 25.85404, saving model to model.h5\n", | |
"0s - loss: 18.3146 - val_loss: 25.8540\n", | |
"Epoch 51/60\n", | |
"Epoch 00050: val_loss improved from 25.85404 to 25.60145, saving model to model.h5\n", | |
"0s - loss: 17.7337 - val_loss: 25.6014\n", | |
"Epoch 52/60\n", | |
"Epoch 00051: val_loss improved from 25.60145 to 25.08804, saving model to model.h5\n", | |
"0s - loss: 17.7267 - val_loss: 25.0880\n", | |
"Epoch 53/60\n", | |
"Epoch 00052: val_loss did not improve\n", | |
"0s - loss: 17.4104 - val_loss: 25.9487\n", | |
"Epoch 54/60\n", | |
"Epoch 00053: val_loss improved from 25.08804 to 24.87356, saving model to model.h5\n", | |
"0s - loss: 17.1968 - val_loss: 24.8736\n", | |
"Epoch 55/60\n", | |
"Epoch 00054: val_loss improved from 24.87356 to 24.43540, saving model to model.h5\n", | |
"0s - loss: 16.7500 - val_loss: 24.4354\n", | |
"Epoch 56/60\n", | |
"Epoch 00055: val_loss improved from 24.43540 to 24.03485, saving model to model.h5\n", | |
"0s - loss: 16.6910 - val_loss: 24.0349\n", | |
"Epoch 57/60\n", | |
"Epoch 00056: val_loss did not improve\n", | |
"0s - loss: 17.4414 - val_loss: 25.3191\n", | |
"Epoch 58/60\n", | |
"Epoch 00057: val_loss improved from 24.03485 to 23.97303, saving model to model.h5\n", | |
"0s - loss: 16.7362 - val_loss: 23.9730\n", | |
"Epoch 59/60\n", | |
"Epoch 00058: val_loss improved from 23.97303 to 22.57416, saving model to model.h5\n", | |
"0s - loss: 15.9978 - val_loss: 22.5742\n", | |
"Epoch 60/60\n", | |
"Epoch 00059: val_loss did not improve\n", | |
"0s - loss: 16.4507 - val_loss: 22.8278\n" | |
] | |
} | |
], | |
"source": [ | |
"history = net.fit(X_train, y_train, validation_split=0.2, epochs=60, verbose=2, callbacks=callbacks)" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 13, | |
"metadata": {}, | |
"outputs": [ | |
{ | |
"data": { | |
"text/plain": [ | |
"<matplotlib.legend.Legend at 0x10eed8e90>" | |
] | |
}, | |
"execution_count": 13, | |
"metadata": {}, | |
"output_type": "execute_result" | |
}, | |
{ | |
"data": { | |
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXoAAAEACAYAAAC9Gb03AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xt0HOWZ5/Hvo+7W/WJZtmVbsnyRbTAEQ4BxuBzATIIT\nZ3aAmYxnMIQkTGbOcMKchMkOEyeZrM2BzQnZIcNwlh02Z9gEWC7JZDaELPdgC7wkQCAmNnffL5Is\nXyVLstSSut/9463WxZaxbKlcrfbvc857qrrcXf2+dvupep/3rSpzziEiIrkrL+oKiIhIuBToRURy\nnAK9iEiOU6AXEclxCvQiIjlOgV5EJMeNKNCb2TYz+72ZrTOz14NtlWb2vJl9YGbPmVnFoPffa2Yb\nzewtMzsvrMqLiMjxjfSMPg0sds593Dm3KNi2AviVc+4MYDXwTQAzWwrUO+fmAX8D3D/GdRYRkRMw\n0kBvw7z3GuDBYP3B4HVm+0MAzrnXgAozqx5lPUVE5CSNNNA74Dkz+62Z/VWwrdo51wLgnNsNTAm2\n1wA7B322MdgmIiIRiI/wfZc453ab2WTgeTP7AB/8h2PDbNN9FkREIjKiQB+cseOc22tmTwCLgBYz\nq3bOtZjZVGBP8PZdwIxBH68Fmo7cp5kp+IuInATn3HAn1Md03NSNmRWbWWmwXgIsATYATwJfCt72\nJeAXwfqTwBeC918EtGZSPMNUNmfLypUrI6+D2qe2qX25V07GSM7oq4GfB2fgceAR59zzZvYG8FMz\n+0tgB7AsCN5Pm9lnzWwT0AncdFI1ExGRMXHcQO+c2wocNRfeOXcA+NQxPvO3o6+aiIiMBV0ZG5LF\nixdHXYVQ5XL7crltoPadjuxkcz6j/mIzF9V3i4iMV2aGO8HB2JFOrxQRYdasWWzfvj3qapwWZs6c\nybZt28ZkXzqjF5ERC84mo67GaeFYf9cnc0avHL2ISI5ToBcRyXEK9CIiOU6BXkQkxynQi8hp7aWX\nXmLGjBnHfd/s2bNZvXr1KajR2FOgF5HTntkJTWIZdxToRURynAK9iOSEu+66i2XLlg3Zduutt3Lr\nrbfy4x//mLPOOovy8nLmzp3LD3/4w1F9V09PD7feeis1NTXU1tbyd3/3d/T29gKwf/9+/viP/5jK\nykqqqqq44oorhtSxtraW8vJyFixYwJo1a0ZVj5HSlbEiMqbGKgtyotdlLV++nDvuuIOOjg5KS0tJ\np9P89Kc/5YknnmD//v089dRTzJ49m7Vr1/KZz3yGRYsWcd55R92vcUTuvPNOXn/9ddavXw/A1Vdf\nzZ133sntt9/O3XffzYwZM9i/fz/OOV599VUAPvzwQ+677z7efPNNqqur2bFjB6lU6qS+/0TpjF5E\nxpRzY1NOVF1dHeeffz5PPPEEAC+++CIlJSUsWrSIpUuXMnv2bAAuu+wylixZwtq1a0+6jY8++igr\nV66kqqqKqqoqVq5cycMPPwxAIpGgubmZrVu3EovFuPTSSwGIxWL09PTw9ttv09fXR11dXX+dwqZA\nLyI5Y/ny5Tz22GMAPPbYY1x//fUAPPPMM1x88cVUVVVRWVnJM888w759+076e5qamqirq+t/PXPm\nTJqa/IP0brvtNurr61myZAlz587lrrvuAqC+vp577rmHVatWUV1dzfXXX09zc/NJ1+FEKNCLSM5Y\ntmwZDQ0NNDY28vOf/5wbbriBnp4e/uzP/ox/+Id/YO/evRw8eJClS5eO6p4906dPH3Jzt+3btzN9\n+nQASktL+ad/+ic2b97ML3/5S37wgx/05+Kvu+461q5d2//ZFStWjKK1I6dALyI5Y9KkSVxxxRXc\ndNNNzJkzh/nz59PT00NPTw+TJk0iLy+PZ555hueff35U37N8+XLuvPNO9u3bx759+7jjjju48cYb\nAXjqqafYvHkz4IN+PB4nFovx4YcfsmbNGnp6esjPz6eoqIhYLDbqNo+EAr2I5JTrr7+eF198kRtu\nuAHwwfbee+9l2bJlTJw4kccff5xrrrnmhPc7eK79P/7jP3LhhReycOFCzj33XC688EK+/e1vA7Bx\n40Y+9alPUVZWxqWXXsott9zC5ZdfTjKZZMWKFUyePJnp06ezd+9evvvd745No49Xd92mWERGSrcp\nPnV0m2IRERkxBXoREWDnzp2UlZVRXl7eXzKvd+3aFXX1RkWpGxEZMaVuTp2cSd2kXTrKrxcROS1E\nGui7+7qj/HoRkdNCpIG+q7cryq8XETktRBvo+xToRUTCptSNiOSM8fwUqDApdSMikuOUuhERyXFK\n3YhIzhlvT4AKW6RPmFLqRkTCMN6eABW2aAO9UjciOcduH5tnCbqVJ38F7qOPPsp9991HVVUVACtX\nruTmm2/m9ttvH/IEqPr6+mGfAFVVVTXkwSLjXaSBXqkbkdwzmgA9WpnbBhzvCVCrVq1iyZIlmBl/\n/dd/zTe+8Y0hT4B69913+fSnP83dd9/NtGnTomrOmNGsGxHJKWZGTU3NuHoCVNg060ZEckbmJmDX\nXXfduHoCVNh0Ri8iOSPzFKjvfOc7XHDBBePmCVBhG/Ftis0sD3gD2OWcu9rMZgGPA5XA74AbnXN9\nZpYPPARcAOwD/sI5t2OY/bk7X7qTb1/+7TFpiIiET7cpPnWiuk3x14B3B72+C7jbOXcG0Ap8Odj+\nZeCAc24ecA/w/WPtUKkbEZHwjSjQm1kt8Fng3wZt/kPgP4L1B4Frg/VrgtcAPwM+eaz9KnUjIhK+\nkZ7R/zNwG+AAzKwKOOhc/5NDdgE1wXoNsBPAOZcCWs1s4nA71fRKEZHwHXcevZn9EdDinHvLzBZn\nNgdlMDfoz4bsYtCfDfHrh3/Nqt+uAmDx4sUsXrx4uLeJiJy2GhoaaGhoGNU+jjsYa2bfBT4P9AFF\nQBnwBLAEmOqcS5vZRcBK59xSM3s2WH/NzGJAs3NuyjD7ddf97Doe+9xjo2qAiJw6Gow9dU7pYKxz\n7lvOuTrn3BzgOmC1c+7zwBpgWfC2LwK/CNafDF4T/Pkxbw6t1I2ISPhGcwuEFcDjZnYHsA54INj+\nAPCwmW0E9uMPDsPSYKzI+DJz5sz+ueoSrpkzZ47Zvk4o0DvnXgJeCta3Ap8Y5j1J4M9Hsj9NrxQZ\nX7Zt2xZ1FeQk6H70IiI5TrdAEBHJcbqpmYhIjtMZvYhIjlOOXkQkxyl1IyKS4yJP3egqOxGRcEUa\n6GN5MXrTvVFWQUQk50Ua6IviRRqQFREJWbSBPlGkPL2ISMgiP6PXzBsRkXBFGugL44VK3YiIhEyp\nGxGRHBd56kZn9CIi4Yo8daMcvYhIuJS6ERHJcUrdiIjkOKVuRERyXPRn9ErdiIiEKvocvVI3IiKh\nivyMXqkbEZFwRZ6jV+pGRCRcSt2IiOS4yFM3OqMXEQlX5Kkb5ehFRMIVfepGZ/QiIqGKPnWjHL2I\nSKiUuhERyXFK3YiI5DilbkREclzkZ/R
K3YiIhCvyHL1SNyIi4VLqRkQkx0WeutEZvYhIuCJP3ShH\nLyISruMGejMrMLPXzGydmW0ws5XB9llm9qqZfWBmj5lZPNieb2aPm9lGM/uNmdUda99K3YiIhO+4\ngd45lwSudM59HDgPWGpmnwDuAu52zp0BtAJfDj7yZeCAc24ecA/w/WPtOz+WT1+6j1Q6NcpmiIjI\nsYwodeOcOxysFgBxwAFXAv8RbH8QuDZYvyZ4DfAz4JPH2q+ZKX0jIhKyEQV6M8szs3XAbuAFYDPQ\n6pxLB2/ZBdQE6zXATgDnXApoNbOJx9q3BmRFRMI10jP6dJC6qQUWAQuGe1uwtCO226A/O4ry9CIi\n4YqfyJudc4fM7CXgImCCmeUFZ/W1QFPwtl3ADKDJzGJAuXPu4HD7W7VqFYdfO8z3d3+fz332cyxe\nvPikGyIikosaGhpoaGgY1T7MuWOebPs3mE0Cep1zbWZWBDwHfA/4IvB/nHM/MbN/BX7vnLvfzL4C\nfMw59xUzuw641jl33TD7dc45zvnXc3jkTx9hYfXCUTVEROR0YGY4547MnHykkZzRTwMeNLM8fKrn\nJ865p83sPeBxM7sDWAc8ELz/AeBhM9sI7AeOCvKDKXUjIhKu4wZ659wG4Pxhtm8FPjHM9iTw5yOt\ngAZjRUTCFemVsaCrY0VEwhZ5oFfqRkQkXNEHeqVuRERCFXmgV+pGRCRckQd6pW5ERMKVHYFeqRsR\nkdBEH+j13FgRkVBFHugL44VK3YiIhCjyQK/UjYhIuKIP9AkNxoqIhCnyQF8YL6Q7pRy9iEhYIg/0\nml4pIhKu6AO9rowVEQlV9IE+rumVIiJhijzQa3qliEi4Ig/0St2IiIQr+kCv1I2ISKgiD/RK3YiI\nhCvyQK/UjYhIuKIP9JpHLyISqsgDvR48IiISrsgDfeY2xc65qKsiIpKTIg/0eZZHIpYgmUpGXRUR\nkZwUeaAHTbEUEQlTVgR6TbEUEQlPVgR6TbEUEQlPdgR6pW5EREKTFYFeqRsRkfBkRaBX6kZEJDzZ\nEeh1dayISGiyItDr6lgRkfBkRaBX6kZEJDzZEeiVuhERCU3WBHqlbkREwpEVgb4wXqjUjYhISLIi\n0BcllLoREQnLcQO9mdWa2Woze9fMNpjZV4PtlWb2vJl9YGbPmVnFoM/ca2YbzewtMzvveN+h1I2I\nSHhGckbfB3zdOXcWcDFwi5mdCawAfuWcOwNYDXwTwMyWAvXOuXnA3wD3H+8LlLoREQnPcQO9c263\nc+6tYL0DeA+oBa4BHgze9mDwmmD5UPD+14AKM6v+qO9Q6kZEJDwnlKM3s1nAecCrQLVzrgX8wQCY\nErytBtg56GONwbZjKoprHr2ISFhGHOjNrBT4GfC14Mz+WM/+s2G2feRzAnVlrIhIeOIjeZOZxfFB\n/mHn3C+CzS1mVu2cazGzqcCeYPsuYMagj9cCTcPtd9WqVQC8u/dddk/aDX964g0QEcllDQ0NNDQ0\njGofNpKHcpvZQ8A+59zXB227CzjgnLvLzFYAE5xzK8zss8Atzrk/MrOLgHuccxcNs0+X+e6nPnyK\n+357H0/f8PSoGiMikuvMDOfccJmTYzruGb2ZXQrcAGwws3X4NMy3gLuAn5rZXwI7gGUAzrmnzeyz\nZrYJ6ARuOt53FCU0vVJEJCzHDfTOuVeA2DH++FPH+MzfnkglNL1SRCQ82XFlrG5qJiISmuwI9Erd\niIiEJisCvVI3IiLhyYpAr9SNiEh4siPQ6wlTIiKhyYpArytjRUTCkxWBPpGXIO3S9KX7oq6KiEjO\nyYpAb2bK04uIhCQrAj1oiqWISFiyJtBriqWISDiyJtArdSMiEo7sCfRK3YiIhCJrAr1SNyIi4cia\nQK/UjYhIOLIn0OvqWBGRUGRPoI8rRy8iEoasCfSF8UKlbkREQpA1gb4ortSNiEgYsifQa3qliEgo\nsibQK3UjIhKOrAn0St2IiIQjewK9UjciIqHImkCv1I2ISDiyJtArdSMiEo7sCfS6MlZEJBTZE+h1\nZayISCiyJtArRy8iEo5IA31T08C6UjciIuGINNC/+ebAulI3IiLhyJpAr9SNiEg4sibQK3UjIhKO\n7An0St2IiIQi0kCfTEJzs19X6kZEJByRBvoLLhg4q1fqRkQkHNkT6PVwcBGRUGRNoC+MF9KT6sE5\nF2WVRERyznEDvZk9YGYtZrZ+0LZKM3vezD4ws+fMrGLQn91rZhvN7C0zO++j9j040JsZ+bF8DciK\niIyxkZzR/wj49BHbVgC/cs6dAawGvglgZkuBeufcPOBvgPs/asezZkF3N+ze7V8rTy8iMvaOG+id\nc/8POHjE5muAB4P1B4PXme0PBZ97Dagws+pj7dsMzj9/aJ5eZ/QiImPrZHP0U5xzLQDOud3AlGB7\nDbBz0Psag23HdMEF8MYbfl1TLEVExl58jPdnw2w75ujqqlWr2LwZ1q+HK65YzILJC3h116vUT6wf\n42qJiIxPDQ0NNDQ0jGofNpJZLmY2E/ilc25h8Po9YLFzrsXMpgJrnHMLzOz+YP0nwfveB67InP0f\nsU/nnGPLFrjsMmhshEc3PMpDv3+IZz//7KgaJSKSq8wM59xwJ9XHNNLUjTH0bP1J4EvB+peAXwza\n/oWgMhcBrcMF+cFmz4auLj8ge+2Z1/Ja42s0tzePsFoiInI8I5le+Sjwa2C+me0ws5uA7wFXmdkH\nwCeD1zjnnga2mtkm4H8CXzn+/gemWRYnirn2zGt5dMOjo2iSiIgMNqLUTShfHKRuAFasgOJi+C//\nBVZvXc3Xn/s6b938ViT1EhHJZmGmbkI1+MKpxbMWs79rPxtaNkRbKRGRHJF1gT7P8rjhnBt4eP3D\n0VZKRCRHZEWgnz0bDh+GlmDY9saFN/LIhkdIpVPRVkxEJAdkRaA/8grZs6ecTXVJNWu2rYm2YiIi\nOSArAj0MTd8AfOHcLyh9IyIyBrI20C//2HKe/OBJOns6o6uUiEgOyNpAX11azcW1F/PE+09EVykR\nkRyQNYF+zhzo6IDf/GZg240Lb1T6RkRklLIm0JvB/ffDn/wJ3Hwz7N8P15x5jW6JICIySlkT6AH+\n4i/gvfcgHoezzoJHflzMNWfolggiIqORFbdAGM5bb8Ett8CB0l+z/8plvH7zK8yaMOvUVVBEJAuN\n21sgDOe882DtWlhx/SV0vfBNLv/hp9nbuTfqaomIjDtZe0Y/2IsvwtX//B3qPvksr39lNWUFZSHX\nTkQkO53MGf24CPQAzzzj+NMf3czCK7aw9uanyI/lh1g7EZHslFOpmyMtXWo89vn/we9fL+PqH32R\ntEtHXSURkXFh3AR6gGuvjvGj//Qoq3/bzI2P3MqRPYJUCjp1Ia2IyBDjJnUz2AP/u42bX1tMbXE9\nszfeTdv2mTQ3w759EIvBtGn+SttMOf98qKoa4waIiEQgp3P0R/q/z3bxw3f+Gw1d97K8/qvcdslt
\nzJxeRCwGGzf62ym8+Sa88QasWwclJVBf76/AzZS6Oujr872AwSWV8k+8Kinxy8x6TQ3MnAmJxBj+\nRYiInIDTKtBnbG/dzt+/8Pe80fQGP1jyA64981rMhv4dpNPQ1ARbtgwtO3b4oF1SMrTEYv6B5YcP\n+8B/+LC/PcPOnX4/tbX+oDF37tDlnDn+oCAiEpbTMtBnvLjlRb767FepKqricws+x1X1V7Fg0oKj\ngv5o9fTAtm2waRNs3jyw3LwZtm71KaL6epg6FcrKoLx8oBQW+vTS7t3+ISu7d/sSi8GsWf4BLIOX\nNTV+P/maYCQigdM60AP0pnp54v0neH7z87yw5QV6071cNecqrppzFX9Q8wfUVdRRGC8c0+8cLJWC\nxkYf9PfuhUOHhpauLpg0yQfvwaWvzx88tm4dWG7d6nsPe/b4A8a0ab5MmOC/p7fXfy5TZs70F5md\ndx6cey5UVobWzGE559unHo1IuE77QD+Yc45NBzbxqy2/4oUtL7BhzwZ2tu1kYtFEZk2Y1V/mVM5h\nTuUc6ivrqS2vJZYXC61OJyOd9r2A5mZf2tp8uikeH1jm5fkDw7p1/tYR69f7A8q8ef6g0N3tSzLp\nl/E4VFQM9DQqKvzB5MgUVnGx3099/fBjE3v2wAsvwHPP+eXBg3DmmXD55XDZZb5Mnerf29wMv/ud\nL+vW+Z7QpEm+11Jb65eZ9RkzYMoU3y4RGUqB/jhS6RTNHc1sa93GttZtbD24lS2tW9h8YDNbDm5h\n3+F91FXUMaNiBhMKJ1BRUMGEwgn9pbqkmull06kpr2Fa6TSKEkWntP4jlU4PpJUSCZ8yypSCAt8D\nOHTIHzQyvY22toHB6MzYRGenD+abN/tAnRmbqK31B5StW+HKK2HJErjqKh+g33zT37pi7Vp45RWY\nONHvp6fHz37KlPnz/QGssXFo2bXLj520t/vAP2OGX2bSV5lMnJnf98UX+zJtWnR/3yKnkgL9KHX1\ndrG1dSuNhxppS7bR1t1Ga3crbck2DnYdpKWzhcb2Rpram2hqb6IkUcL0sulUl1YztXQqU0um9q9P\nKJxAaX4pJYkSSvNLKc0vpaygjMrCyjEfNzgVMmMTmzf7QPyxj8GiRR89Aymdhnff9b2FurqBID0S\nXV0DQb+pyR+cMj+XzLK52T+/4De/8b2SSy6Biy7yPYW8PF9iMb9MJv2+tm8fWsrLYeFCOOccv1y4\nEM44w/d6Uilf0mm/dG5gf0cWkVNFgf4Ucs6xv2s/Te1NtHS0sLtjNy2dfrm7YzdtyTY6ejro7On0\ny95O2rrb6Orrorqkmmll05haOpVppdOYXDyZisKKIb2I8oJy4nlxzAzD+pfFiWJqy2uztjcRhXQa\nPvwQfv1reO0130PJBOd02pdEwvcOZs4cKHV1viezYYNPd61f79c3bfKfzRwoMsHdbGB/mZJK+TTX\nxIl+XCRTiov9OEpPz8Cyp8e/r65uaJkxw6e4jnXAcM4f1Nav93WbMQPOPtsP2seyK9Mop4AC/TjQ\n3dfN7o7dNLc309zRTHN7M/sO7xvoQSRbaetuoy3ZRiqdwuFwzvUvO3s7aTzUSHlBOTMqZvhUU/kM\nYhbjcO9huvq6+pfOOWZPmM28qnnMnTiXeRPnMbtyNl29Xby952027NnA+pb1bNizgU0HNlFZWElN\neQ01ZUEpr6G6pJrKokomFE6gstAvKworyLPho1LmALj5wGY2HdjEloNbaOls4UDXAQ52H/TLroP0\npHq4qPYirpx1JVfOvpJ5E+dlTU8nnfZBfSTVcc6nmQ4eHFoOH/bppkTCL/PzfS/hwAHfsxhctm/3\nB5yamqEHgI6OgQNQXp4fZJ8713/mnXd8Wu2MM3zQnzbNH0iSyYHS0zMwWJ/pnaRSPn03d64fw5k/\n35e6upEdNLq7/YwxM5g+3bdJTi0F+tNE2qXZ07mHnW072dG2g52HduKcoyhRRHGimKJ4EUWJIpxz\nbDm4hU0HNrHxwEY2HdjEzkM7SeQlOGvyWSysXsg5U87hnOpzmDdxHq3drf2pqcZDjTS2N9LS2UJr\ndyut3a0c7DrIwe6DdPR0UBT335UpJfkl9KX72HJwC3mWx9yJc6mvrKe+sp5pZdOoLKyksqiSiUUT\n+9NXr+x4hTXb1rBm2xrSLs3iWYtZMGlBf6orU4riRfSkevoPYod7D9PV20V+LJ/5VfOZXzWfmvKa\nYx58xoNMqmrnzoEDQHGxD+4LF0J19dGfaW/3D+rJBP2CgqElc3CJxwd6JplrRDZt8r2gjRv9cu9e\n39soK4PSUr8sK/P7yUwJ3r3bH8Cqq/3BcM8eP2ie6ZXU1vrPFhQMjAdlxoaKigZKcbFfmvkD5eCS\nSPiDVmXliaX6TicK9HJcvale8ixvVLOLUulUf9Dt7OnsD75mxpzKOUwsmnhC+8sckNZsW8OWg1v6\n010dvR39+8+P5fuDWKKI4rhfdvV2sfHARj7Y/wGHkoeYN3Ee86vmM6l40pADRUmihIrCCqpLqvvH\nUMbrWElYurr84zvb2wdKR4c/g588eWAq8OAA3Nvrx0927vSlsdEPvA+e4ZUpXV0D5fBhv3RuoOeU\nKT09Pk2VTPoeQ2Y21uTJ/rsnTBhYTpgwtNeUSAxMPshczV5QMPSA0d3te1wHDvhlMun3PWWKH9sZ\nDz0UBXo5bR1KHmLj/o18uP/D/l7H4DGS1mQrLR0t/eMoh3sPM6VkCpWFlUf1IErzSynLLxsyiF6a\nX0pBrKB/rAToX0/EEuTH8imIFfhlvICCWAEl+SWUJEooyS85odtqp9IpDvceJpYXoyheNCYHpLRL\ns711O2/veZsDXQe4eMbFWZUuO1Jn59DZWPv2QWurD86ZZVvb0DGQ3l5fursHZo719Q30IDo6fOqq\nsnJgTCU/3+97zx4f/CsqfNAvKxvaC8mUTO9kcE+ls9Mf8Bob/bKpyfeQqqoGpgxnphBnpg9nthWd\nxFCbAr3ICHX3ddPS0UJbso32ZHv/gaGjp4P2niNeJ9vp6O0g2ZfE4X+zmd9u2qXpS/eRTCXpSfWQ\n7EsOSTNlDjZ5ltcf8ON5cRJ5CeJ58f4B90zvqLO3k2RfkuJEMSmXIpVOUVFY0T9IX1FYQXlBOWX5\nZb4UlFFeUE5hvLB/LCdTv5RLsfnAZjbs2cA7e99hQuEEPjblY1QUVPDKzlfoTfVy+czL+8ucyjmU\nJEqyNvifjL6+gR5EJngfq3mplA/2LS3+oDC4F5Ipw/VSiot9EJ8+faAXMmmS7yFlpgxnlpn0XGZb\nWZl/fzw+cKDKlLPPhqeeOrqeCvQiWcg5R0+qh87eTnpSPfSl+4aUtEv7cY7g7H/wWXyyL9k/UN+W\n9NN925PtHEoeor2nvX+9u697yAwtgDzLY/aE2ZxTfQ5nTz6byqLKIXXa3radl7e/zMvbX2btjrXs\nOrSL7r5uShIllBX4A0lJfsmwYx/OOdIuPaQATCqexPS
y6UNKeUE5Xb1dAxMFgvXygvL+mWdTS6cy\ntXRq/2yyVDrlD5zBATRmMYoTxRTGC3PmQJRO+zP/xkZ/kDkyBVVc7HsXR1KgF5FRSaVT/b2a9mQ7\nnb2dRz33ISOWFyPP8sizPAzD4dh3eF//dSbN7c00dTRxKHmof/A+syyMF9KWbOufjpwpQP/BryDu\nU2H5sXzSLk1njz9QZiYdFCeKj0q5lSRKKIgX9KfXMoZLsWVKIpYY0stKxBIUxAqOmtxQGC8cMtXZ\nzMizPLr7uodcc9PW3UZ7TzvFiWIqCnwPrLygvL83VlFQQUVhxUem5XpTvfSmeylOHH1PEQV6ERm3\nnHN09fnZVDGLDRsEj5wI0NnbOWQspqOng2QqedTn+lNsQWot01PI9LB6U71+mfYBNtmXPKoH0t3X\nPWSqs8P3agrjhf3BO3MtTGl+KV29XbQl2ziUPDSwDHpmbd1tpF26P/gPbldXbxcAl8y4hJdvevmo\ntijQi4iME4PTcrG82JAeTyJ27EvOFehFRHJc1jwc3Mw+Y2bvm9mHZvaNML5DRERGZswDvZnlAf8d\n+DRwNrDczM4c6+/Jdg0NDVFXIVS53L5cbhuofaejMM7oFwEbnXPbnXO9wOPANSF8T1bL9R9bLrcv\nl9sGat+iwGfQAAAEF0lEQVTpKIxAXwPsHPR6V7BNREQiEEagH26QQKOuIiIRGfNZN2Z2EbDKOfeZ\n4PUKwDnn7jrifQr+IiInIfLplWYWAz4APgk0A68Dy51z743pF4mIyIiM+U05nXMpM/tb4Hl8augB\nBXkRkehEdsGUiIicGpE8kifXLqgyswfMrMXM1g/aVmlmz5vZB2b2nJlVRFnHk2VmtWa22szeNbMN\nZvbVYHuutK/AzF4zs3VB+1YG22eZ2atB+x4zs3HwSIrhmVmemf3OzJ4MXudM2wDMbJuZ/T74N3w9\n2JYrv88KM/t3M3vPzN4xs0+cTNtOeaDP0QuqfoRvz2ArgF85584AVgPfPOW1Ght9wNedc2cBFwO3\nBP9eOdE+51wSuNI593HgPGCpmX0CuAu4O2hfK/DlCKs5Wl8D3h30OpfaBpAGFjvnPu6cWxRsy4nf\nJ/AvwNPOuQXAucD7nEzbnHOntAAXAc8Mer0C+MaprkcI7ZoJrB/0+n2gOlifCrwfdR3HqJ1PAJ/K\nxfYBxcAb+Iv+9gB5wfaLgGejrt9JtqkWeAFYDDwZbNubC20b1MatQNUR28b97xMoAzYPs/2E2xZF\n6uZ0uaBqinOuBcA5txuYHHF9Rs3MZuHPel/F/9Byon1BamMdsBsfFDcDrc4FT9Pwv9HpUdVvlP4Z\nuI3gWhYzqwIO5kjbMhzwnJn91sz+KtiWC7/POcA+M/tRkHr7oZkVcxJtiyLQ64KqccjMSoGfAV9z\nznWQQ/9mzrm086mbWvzZ/ILh3nZqazV6ZvZHQItz7i0G/t8ZR/8fHHdtO8IlzrkLgc/iU4uXMf7b\nBH5W5PnAfc6584FOfAbkhNsWRaDfBdQNel0LNEVQj7C1mFk1gJlNxacCxqVgsO5nwMPOuV8Em3Om\nfRnOuUPAS/h0xoRgPAnG72/0UuBqM9sCPAb8IXAPUJEDbesXnNXinNuLTy0uIjd+n7uAnc65N4LX\n/4EP/CfctigC/W+BuWY208zygeuAJyOox1g78kzpSeBLwfoXgV8c+YFx5H8B7zrn/mXQtpxon5lN\nysxaMLMi/PjDu8AaYFnwtnHZPufct5xzdc65Ofj/Z6udc58nB9qWYWbFQW8TMysBlgAbyIHfZ5Ce\n2Wlm84NNnwTe4STaFsk8ejP7DH40OXNB1fdOeSXGkJk9ih/sqgJagJX4M4t/B2YAO4BlzrnWqOp4\nsszsUuBl/H8eF5Rv4a94/injv33nAA/if4t5wE+cc//VzGbj77xaCawDPu/83VjHJTO7AvjPzrmr\nc6ltQVt+jv9dxoFHnHPfM7OJ5Mbv81zg34AEsAW4CYhxgm3TBVMiIjkukgumRETk1FGgFxHJcQr0\nIiI5ToFeRCTHKdCLiOQ4BXoRkRynQC8ikuMU6EVEctz/B1zALLXFjbHGAAAAAElFTkSuQmCC\n", | |
"text/plain": [ | |
"<matplotlib.figure.Figure at 0x10efbb550>" | |
] | |
}, | |
"metadata": {}, | |
"output_type": "display_data" | |
} | |
], | |
"source": [ | |
"plt.plot(history.history['val_loss'], label='val_loss')\n", | |
"plt.plot(history.history['loss'], label='loss')\n", | |
"plt.legend()" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 14, | |
"metadata": { | |
"scrolled": true | |
}, | |
"outputs": [ | |
{ | |
"name": "stdout", | |
"output_type": "stream", | |
"text": [ | |
"\r", | |
" 32/102 [========>.....................] - ETA: 0s" | |
] | |
}, | |
{ | |
"data": { | |
"text/plain": [ | |
"12.41510062124215" | |
] | |
}, | |
"execution_count": 14, | |
"metadata": {}, | |
"output_type": "execute_result" | |
} | |
], | |
"source": [ | |
"# now I can evaluate what my \n", | |
"net.evaluate(X_test, y_test)" | |
] | |
}, | |
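{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Note that `net` currently holds the weights from the *last* epoch. Since `ModelCheckpoint` saved the best-validation-loss model to `model.h5`, a common pattern (sketched here) is to reload that file before evaluating or deploying." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"from keras.models import load_model\n", | |
"\n", | |
"# restore the checkpoint with the lowest validation loss and score it on the test set\n", | |
"best_net = load_model('model.h5')\n", | |
"best_net.evaluate(X_test, y_test)" | |
] | |
}, | |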
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Though not a formal test, let's see how the output distribution looks between the true `y_test` and the $\\hat{y}$ from our model." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 15, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"yhat = net.predict(X_test)" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 16, | |
"metadata": {}, | |
"outputs": [ | |
{ | |
"data": { | |
"text/plain": [ | |
"<matplotlib.legend.Legend at 0x10eef7350>" | |
] | |
}, | |
"execution_count": 16, | |
"metadata": {}, | |
"output_type": "execute_result" | |
}, | |
{ | |
"data": { | |
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAEACAYAAACj0I2EAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAFONJREFUeJzt3X+QVfWZ5/H3A+gGRQmyDhayNDGDbqIGdNVNoju5q6hU\nUpbumETR2RBmy53JusFx3agzkyq6p7Zco6szOD+2JhOlyCQwE407IanN6ibkuuViEH8gJIqMkwER\nAokRULJB+fHsH32Ftu2m+/46tzn9flXd8va5p8/zcDx8+svp7zknMhNJ0tFvTKcbkCS1hoEuSSVh\noEtSSRjoklQSBroklYSBLkklMWSgR8T9EbEjItb1WTYrIp6IiGcj4smIOK+9bUqShjKcEfoS4PJ+\ny+4CFmXmOcAi4O5WNyZJqs+QgZ6ZjwM7+y0+CEysvX8vsLXFfUmS6jSuwe+7GXgkIu4BAvho61qS\nJDWi0V+Kfg64KTOn0xvuD7SuJUlSI2I493KJiC7g25n5odrXuzLzvX0+352ZEwf5Xm8WI0kNyMyo\nZ/3hjtCj9nrb1oj4GEBEXAJsHKIpX5ksWrSo4z2MlNeR9sU55yRPP11sP1/8YvJHfzTy9sVoe7kv\nDr8aMeQ59IhYBlSAyRHxMr2zWm4A7ouIscBe4N83VF2S1DJDBnpmXjfIR849l6QRxCtFC1SpVDrd\nwojhvjjMfXGY+6I5BnqBPFgPc18c5r44zH3RnEbnoUtS02bMmMHmzZs73UZHdXV1sWnTppZsy0CX\n1DGbN29ueEZHWUTUNTPxiDzlIkklYaBLUkkY6JJUEga6JJWEgS5JQ9i7dy979+7tdBtDMtAljSgz\nZkBE+14zZtTXz8GDB1m0aBHd3d0jfkaO0xYljSibN0M7c7PeWYLVapVbbrmFzOQHP/gBF198cXsa\nawEDXZKOoG+AT5kypYOdDM1TLpI0gFdeeYWHH36YefPmAbBv3z4uvfTSDnd1ZAa6JA1gw4YNXHDB\nBWzbtg2AJ554ghn1noAvmIEuSQOYM2cOS5cu5frrrwfg+9//PpdddlmHuzoyA12SBrF69Wouuugi\nAFauXMmcOXM63NGRDRnoEXF/ROyIiHX9ln8+IjZExPqIuLN9LUpSZ1x11VV85zvf4b777mP//v1M\nmjSp0y0d0XBG6EuAy/suiIgKcAVwVmaeDfy31rcmaTTq6mrvPPSuruH1sXLlSl566SVuvfVWdu7c\nycKFC9v7B2+B4TyC7vGI6L8LPgfcmZn7a+u82o7mJI0+Lbo1eNMmT57MzJkz+drXvsYZZ5zBtdde\n2+mWhtToPPTTgd+IiDuAXwFfyMynWteWBvLTN35KMrKvVGuF/cceC/zTTrehUW7WrFnMmjWr023U\npdFAHwe8NzM/HBHnA98AThts5e7u7kPvK5WKj5lqwIoXV3DNQ9cw6T0j+xxeK2y/dBdbfrmec3l/\np1uRClOtVqlWq01to9FA3wI8DJCZayLiYERMzsxfDLRy30BXY179f68y76x5PHDlA51upe2Ou/lc\n9uzb3ek2pEL1H+z29PTUvY3hTluM2uttfwdcAhARpwPHDBbmkqRiDDlCj4hlQAWYHBEvA4uAB4Al\nEbEeeBP4TDublCQNbTizXK4b5KN/2+JeJElN8EpRSSoJA12SSsJAl6SSMNAlqSR8YpGkEWXGn8xg\n8+7Nbdt+18QuNv3eprZtv5MMdEkjyubdm8lF7bvFRfTU+VDRo4inXCSpJAx0SRqAzxSVpJLwmaKS\nVBIDPVN05syZ3HzzzR3ubHAGuiQNov8zRd98801WrVrFkiVLWLp0KZ/+9KdZvHgxS5cuZcWKFeze\nvZsFCxYAsHHjRm644Qbmz5/P9u3bC+nXQJekQfR/puj8+fO58MILD4X2rFmzuOmmm97xPRG9s2ju\nvvtupk2bxtSpU3n66acL6ddpi5JGlK6JXW2dWtg1cXgPFX37maJ33HEHPT09LFy4kIjgwIEDh9Y5\n++yzgcMh3vezffv2sXDhQiZNmkRmMU8aM9AljSgj5aKfgZ4pun//flatWsXixYuZNOnw08NmzpzJ\nvffey7p16w4tu+2227jxxhuZPHkyn/zkJ/nYxz7W9p4NdEkawEDPFB03bhxr1qx517of+chHePDB\nB9+x7AMf+ADLli1ra4/9eQ5dkkpiyECPiPsjYkdErBvgs/9ce57oSe1pT5I0XMMZoS8BLu+/MCKm\nAXOA9t1FR5I0bEMGemY+Duwc4KM/Br7Q8o4kSQ1p6JeiEXEFsCUz1789XUftdeAAbNkC3/52sXVP\nOw3OPLPYmpIaU3egR8R44A+BvnepOWKqd3d3H3pfqVSoVCr1lh31Vq+GVavgPY8UV3PvXtiwofcH\niaT2qlarVKvVprbRyAj9/cAM4LnoHZ5PA56OiAsy82cDfUPfQFdjDhyAU0+Fb99dXM3t22H27OLq\nafTp6upitP8rv6ur90Kn/oPdnp6eurc13ECP2ovM/BFwyqEPIv4RODczBzrPLkmD2rRpU6dbKJXh\nTFtcBqwCTo+IlyNiQb9VkiFOuUiS2m/IEXpmXjfE56e1rh1JUqO8UlSSSsJAl6SSMNAlqSQMdEkq\nCQNdkkrCQJekkjDQJakkfGKRBGw58Rt89We/R/etxdY9c8zVrLvzT4stqtIy0CVg2rnr+Z3Xr+eG\ns24urOY3f/gU9z3zXwurp/Iz0CUgAqaedCLnzpxaWM2nXjq5sFoaHTyHLkklYaBLUkkY6JJUEga6\nJJWEgS5JJWGgS1JJDOeJRfdHxI6IWNdn2V0R8UJErI2Ib0bEie1tU5I0lOGM0JcAl/db9ihwZmbO\nBv4e+P1WNyZJqs+QgZ6ZjwM7+y37XmYerH35Q2BaG3qTJNWhFVeK/jbwNy3YzlFj9Sur+e5L3y20\n5tpfPQtMLrSmpKNLU4EeEX8I7MvMZUdar7u7+9D7SqVCpVJppmzH3bXqLsbGWD548gcLqzn92HOI\n1/9NYfUkFatarVKtVpvaRsOBHhHzgY8DFw+1bt9AL4trzryGqz94dWH1HvgJPL6vsHKSCtZ/sNvT\n01P3NoYb6FF79X4RMRe4FfiNzHyz7qqSpJYbzrTFZcAq4PSIeDkiFgB/CkwA/ndEPBMRf9HmPiVJ\nQxhyhJ6Z1w2weEkbepEkNcErRSWpJAx0SSoJA12SSsJAl6SSMNAlqSQMdEkqiVbcy2XUOXgQXtsJ\n27YVV3PnzqHXaYcDB4r9cwJkws63fs62N4or/MZbbzB5fPH3ysmDxe/f8eNh0qRia6oYBnoD1q6F\nR++BRVuLrXv77cXWO+EEmDoVzjuv2LrHXnwe/2X9Au74cbF1//zjf15ovZNPhv0Hit+/v/wlbN0K\nEyYUW1ftZ6A34K234D98Du7+7U530l7HHw/PPdeJyl/uRNHCnTIFzjkHnvizYuuedFLvMazy8Ry6\nJJWEgS5JJWGgS1JJGOiSVBIGuiSVhIE
uSSUxnAdc3B8ROyJiXZ9lkyLi0Yh4MSIeiYiJ7W1TkjSU\n4YzQlwCX91t2O/C9zDwDWAn8fqsbkyTVZ8hAz8zHgf4Xnl8JLK29Xwpc1eK+JEl1avQc+q9l5g6A\nzNwOnNy6liRJjfCXopJUEo3ey2VHREzJzB0RcQrwsyOt3N3dfeh9pVKhUqk0WFaSyqlarVKtVpva\nxnADPWqvt60APgt8CZgPfOtI39w30CVJ79Z/sNvT01P3NoYzbXEZsAo4PSJejogFwJ3ApRHxIjCn\n9rUkqYOGHKFn5nWDfDSnxb1IkprgL0UlqSQMdEkqCQNdkkrCQJekkjDQJakkDHRJKgkDXZJKotFL\n/yUdpQ5MXs//+sdNnPDz4mru3j2GCdsvZ2wUGznnnw+nnFJoyY4y0KVR5o3fnMNX1s3m+PccW1jN\nlc8/w8nP3svZcU1hNbduhbPOgq9+tbCSHWegS6PNmH381eXLef/Ukwor+b5bfou5n9jHf/9cYSVZ\nvhxWrCiu3kjgOXRJKgkDXZJKwkCXpJIw0CWpJAx0SSoJA12SSqKpQI+ImyPiRxGxLiK+HhHFTWyV\nJL1Dw4EeEVOBzwPnZuaH6J3Tfm2rGpMk1afZC4vGAsdHxEHgOGBb8y1JkhrR8Ag9M7cB9wAvA1uB\nXZn5vVY1JkmqTzOnXN4LXAl0AVOBCREx2AOlJUlt1swplznATzLzNYCIeBj4KLCs/4rd3d2H3lcq\nFSqVShNlJal8qtUq1Wq1qW00E+gvAx+OiPcAbwKXAGsGWrFvoEuS3q3/YLenp6fubTRzDv1J4CHg\nWeA5IIAvN7o9SVJzmprlkpk9QP0/RiRJLeeVopJUEga6JJWEgS5JJWGgS1JJGOiSVBIGuiSVhIEu\nSSXR7N0WJR2FFi+Gk8YXV2/XruJqjWYGujTKHHccjD9QbM3p02H27GJrjkYGujTKHHss3Law2BH6\nSw/D8ccXV2+08hy6JJWEgS5JJWGgS1JJGOiSVBIGuiSVhIEuSSXRVKBHxMSIeDAiXoiIH0fEv2xV\nY5Kk+jQ7D30x8D8z81MRMQ44rgU9SZIa0HCgR8QJwL/KzM8CZOZ+4PUW9SVJqlMzI/TTgFcjYgkw\nC3gKuCkzf9WSzobp9ddhz54iK8KBgi+bVnm9deAttr2xrdCaB/NgofVUnGYCfRxwLnBjZj4VEX8C\n3A4s6r9id3f3ofeVSoVKpdJE2Xc680zYvx8iWrbJIb1+GZxySnH1VE6nnngqu/bu4rwvn1do3fdN\neh/HHePZ0ZGmWq1SrVab2kYzgf4KsCUzn6p9/RBw20Ar9g30Vnv1VXjtNRhf4H0prv4GzJhRXD2V\n0/SJ0/mHhf/Q6TY0QvQf7Pb09NS9jYZnuWTmDmBLRJxeW3QJ8Hyj25MkNafZWS4Lga9HxDHAT4AF\nzbckSWpEU4Gemc8B57eoF0lSE7xSVJJKwkCXpJIw0CWpJAx0SSoJA12SSsJAl6SSMNAlqSSavbCo\n4/ZdfAtXf3MjY8cWV3PN1jXMnzW/uILSUe6YscdwzxP38Lc//tvCam7dCuNP+AzwqcJqdlpkZnsL\nRGQ7a8QXx/P1T/41J4z/J22r0d/YMWOZ++tzGRP+A0cajh17dvDk1icLrfmXjzzG+s1b2XzP8kLr\ntkpEkJl13XbwqB+hA8w97ROcdGKBd+eSVJcpE6ZwxRlXFFrz0eoe1rO10Jqd5hBTkkrCQJekkjDQ\nJakkDHRJKgkDXZJKwkCXpJJoOtAjYkxEPBMRK1rRkCSpMa0Yod+EzxKVpI5rKtAjYhrwceArrWlH\nktSoZkfofwx8AWjv/QMkSUNq+NL/iPgEsCMz10ZEBRj0ngPd3d2H3lcqFSqVSqNlJamUqtUq1Wq1\nqW00fHOuiLgD+C1gPzAeOAF4ODM/02+9tt+c6xe3vua9XCS9w+f/cjkrNq4YVTfnaviUS2b+QWZO\nz8zTgGuBlf3DXJJUHOehS1JJtOT2uZn5GPBYK7YlSWqMI3RJKgkDXZJKwkCXpJIw0CWpJAx0SSoJ\nA12SSsJAl6SSMNAlqSQMdEkqCQNdkkrCQJekkjDQJakkDHRJKgkDXZJKwkCXpJIw0CWpJBoO9IiY\nFhErI+L5iFgfEQtb2ZgkqT7NPLFoP/CfMnNtREwAno6IRzNzQ4t6kyTVoZmHRG/PzLW193uAF4BT\nW9WYJKk+LXmmaETMAGYDq1uxPUlqhV27oLu72JpnnAHz5hVb821NB3rtdMtDwE21kfq7dPfZo5VK\nhUql0mxZSTqiWbNh+nYgi6u5dy/87u82FujVapVqtdpU/chs/E8bEeOA7wDfzczFg6yTzdQYsocv\njucXt77GSSeOb1sNSUef5euXs2LjCpZfvbywmrt3w/Tpvf9tVkSQmVHP9zQ7bfEB4PnBwlySVJxm\npi1eCFwPXBwRz0bEMxExt3WtSZLq0fA59Mz8v8DYFvYiSWqCV4pKUkkY6JJUEga6JJWEgS5JJWGg\nS1JJGOiSVBIGuiSVhIEuSSVhoEtSSRjoklQSBroklYSBLkklYaBLUkkY6JJUEga6JJVEU4EeEXMj\nYkNEbIyI21rVlCSpfs08sWgM8GfA5cCZwLyI+OetaqyMmn0AbJm4Lw5zXxzmvmhOMyP0C4C/z8zN\nmbkP+Bvgyta0VU4erIe5Lw5zXxzmvmhOM4F+KrClz9ev1JZJkjqg4WeKAjHAsmxiew2ZsOccxowZ\nqBVJo9kxY4/hsU2PccXyKwqruX8/TJvz74CrCqvZV2Q2lsER8WGgOzPn1r6+HcjM/FK/9QoPeUkq\ng8ysa7TaTKCPBV4ELgF+CjwJzMvMFxraoCSpKQ2fcsnMAxHxH4FH6T0Xf79hLkmd0/AIXZI0srTt\nSlEvOnqniNgUEc9FxLMR8WSn+ylSRNwfETsiYl2fZZMi4tGIeDEiHomIiZ3ssSiD7ItFEfFKRDxT\ne83tZI9FiIhpEbEyIp6PiPURsbC2fNQdFwPsi8/Xltd9XLRlhF676GgjvefXtwFrgGszc0PLix0l\nIuInwL/IzJ2d7qVoEXERsAf4amZ+qLbsS8AvMvOu2g/8SZl5eyf7LMIg+2IR8EZm3tvR5goUEacA\np2Tm2oiYADxN73UsCxhlx8UR9sU11HlctGuE7kVH7xaM0nvnZObjQP8fZFcCS2vvl9KpeV4FG2Rf\nwMDTgEsrM7dn5tra+z3AC8A0RuFxMci+ePuanrqOi3YFjBcdvVsCj0TEmoi4odPNjAC/lpk7oPeA\nBk7ucD+ddmNErI2Ir4yG0wx9RcQMYDbwQ2DKaD4u+uyL1bVFdR0X7Qr0EXHR0Qjz0cw8D/g4vf+T\nLup0Qxox/gJ4f2bOBrYDo+nUywTgIeCm2uh01ObEAPui7uOiXYH+CjC9z9fT6D2XPmrVRhtk5s+B\n/0HvaanRbEdETIFD5xB/1uF+OiYzf56Hf5n1V8D5neynKBExjt4A++vM/FZt8ag8LgbaF40cF+0K
\n9DXAr0dEV0QcC1wLrGhTrREvIo6r/fQlIo4HLgN+1NmuChe8819uK4DP1t7PB77V/xtK7B37ohZc\nb/tNRs+x8QDwfGYu7rNstB4X79oXjRwXbZuHXptis5jDFx3d2ZZCR4GIeB+9o/Kk92Kur4+m/RER\ny4AKMBnYASwC/g54EPhnwMvApzJzV6d6LMog++Jf03ve9CCwCfidt88jl1VEXAj8H2A9vX8vEvgD\neq84/waj6Lg4wr64jjqPCy8skqSSGJXT6CSpjAx0SSoJA12SSsJAl6SSMNAlqSQMdEkqCQNdkkrC\nQJekkvj/9NaHyJhiBlYAAAAASUVORK5CYII=\n", | |
"text/plain": [ | |
"<matplotlib.figure.Figure at 0x10e8c3950>" | |
] | |
}, | |
"metadata": {}, | |
"output_type": "display_data" | |
} | |
], | |
"source": [ | |
"_, bins, _ = plt.hist(yhat, histtype='step', label=r'$\\hat{y}$')\n", | |
"plt.hist(y_test, bins=bins, histtype='step', label=r'$y_{\\mathsf{true}}$')\n", | |
"plt.legend()" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## Ok, so?\n", | |
"\n", | |
"So far, we've done nothing that you can't do in TMVA (maybe besides the Adam optimizer), so why use Keras? You gain the ability to work with things like sequence data, images, and although maybe not necessary for the Boston dataset, you are able to compose arbitrary graphs!\n", | |
"\n", | |
"Let's quickly show how we can create a **residual** model with dropout for our boston housing model.\n", | |
"\n", | |
"The basis of a residual block is the idea that learning the residual components beyond the identity function decouples some of the difficulties in learning complex funcionals. The idea is instead of having $z = f(x)$, we want $z = x + f(x)$ where our $f$ is now tasked with learning a residual on the data" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 17, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"from keras.layers import Dropout, add" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 18, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"# we define the input shape (i.e., how many input features) **without** the batch size\n", | |
"x = Input(shape=(13, ))\n", | |
"\n", | |
"# we want this layer to be normal, but skip into a layer downstream\n", | |
"skip = Dense(20)(x)\n", | |
"h = Dropout(0.5)(skip)\n", | |
"h = Activation('relu')(h)\n", | |
"\n", | |
"h = Dense(20)(h)\n", | |
"skip = add([h, skip])\n", | |
"h = Dropout(0.5)(skip)\n", | |
"h = Activation('relu')(h)\n", | |
"\n", | |
"h = Dense(20)(h)\n", | |
"h = add([h, skip])\n", | |
"h = Dropout(0.5)(h)\n", | |
"h = Activation('relu')(h)\n", | |
"\n", | |
"# our output is a single number, the house price.\n", | |
"y = Dense(1)(h)\n", | |
"\n", | |
"# A model is a conta\n", | |
"resnet = Model(x, y)" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 19, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"plot_model(resnet, to_file='arch.png')" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"![arch](arch.png)" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Resduals, and the ability to construct complicated graph structures, allows us to create powerful models for semi-structured data." | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"More generally, keras allows you to combine parts of a graph using different ways of merging (like adding, cosine similarity, concatenation, multiplication, etc.)" | |
] | |
}, | |
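{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"A minimal sketch of these merge ops (the shapes are arbitrary; cosine similarity is exposed through `dot(..., normalize=True)`):" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"from keras.layers import add, multiply, concatenate, dot\n", | |
"\n", | |
"a = Input(shape=(8,))\n", | |
"b = Input(shape=(8,))\n", | |
"\n", | |
"summed = add([a, b])             # elementwise sum, shape (8,)\n", | |
"product = multiply([a, b])       # elementwise product, shape (8,)\n", | |
"stacked = concatenate([a, b])    # concatenation, shape (16,)\n", | |
"similarity = dot([a, b], axes=-1, normalize=True)  # cosine similarity" | |
] | |
}, | |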
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## Dealing with Sequences\n", | |
"\n", | |
"Most applications of sequence / recurrent models around grounded in natural language processing or stock price analysis. Let's use a hypothetical example that is grounded in Physics.\n", | |
"\n", | |
"To ground this problem, let's say we're building a network for an analysis that reads in all jets and all photons from two different streams. There are arbitrary numbers of jets and photons, so we need a model that can handle a sequence.\n", | |
"\n", | |
"Let's have the following two constraints on our jets and photons:\n", | |
"\n", | |
"* We have a maximum of 8 jets with 6 features\n", | |
"* We have a maximum of 2 photons with 11 features\n", | |
"\n", | |
"We can order our physics object by some value, say $p_T$, and construct a recurrent neural network, in particular a Bidirectional LSTM, to learn a function that maps this to signal/background." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 20, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"from keras.layers import LSTM, concatenate, Bidirectional" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 21, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"# you wound need to structure your input into this format such that it has shape (nb_samples, nb_objects, nb_features)\n", | |
"n_jets = 8\n", | |
"n_jet_feats = 6\n", | |
"\n", | |
"n_photons = 2\n", | |
"n_photon_feats = 11" | |
] | |
}, | |
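{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Since real events have *varying* numbers of objects, a common convention (a sketch with made-up data) is to zero-pad each event up to the maximum object count:" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"import numpy as np\n", | |
"\n", | |
"def pad_events(events, max_objects, n_features):\n", | |
"    # events: list (one entry per event) of arrays of shape (n_objects_i, n_features)\n", | |
"    out = np.zeros((len(events), max_objects, n_features))\n", | |
"    for i, ev in enumerate(events):\n", | |
"        ev = np.asarray(ev)[:max_objects]  # truncate if an event has too many objects\n", | |
"        out[i, :len(ev), :] = ev\n", | |
"    return out\n", | |
"\n", | |
"# e.g., two toy events with 3 jets and 1 jet respectively\n", | |
"toy_jets = [np.random.rand(3, n_jet_feats), np.random.rand(1, n_jet_feats)]\n", | |
"padded_jets = pad_events(toy_jets, n_jets, n_jet_feats)  # shape (2, 8, 6)" | |
] | |
}, | |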
{ | |
"cell_type": "code", | |
"execution_count": 22, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"jets = Input(shape=(n_jets, n_jet_feats), name='jets')\n", | |
"photons = Input(shape=(n_photons, n_photon_feats), name='photons')" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 23, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"features = concatenate([\n", | |
" Bidirectional(LSTM(10, name='jet_lstm'))(jets), \n", | |
" Bidirectional(LSTM(10, name='photon_lstm'))(photons), \n", | |
"])" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 24, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"y = Activation('sigmoid', name='sigmoid')(Dense(1, name='logistic')(features))" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 25, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"rnn = Model([jets, photons], y)" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 26, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"plot_model(rnn, to_file='arnn.png')" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"![arnn](./arnn.png)" | |
] | |
}, | |
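{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Training works exactly as before. Since the `Input` layers are named, we can feed a dict keyed by those names -- a sketch with random stand-in data (real inputs would come from your event selection):" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"import numpy as np\n", | |
"\n", | |
"rnn.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])\n", | |
"\n", | |
"# stand-in data: 256 events with random features and random signal/background labels\n", | |
"fake_jets = np.random.rand(256, n_jets, n_jet_feats)\n", | |
"fake_photons = np.random.rand(256, n_photons, n_photon_feats)\n", | |
"fake_labels = np.random.randint(0, 2, size=256)\n", | |
"\n", | |
"rnn.fit({'jets': fake_jets, 'photons': fake_photons}, fake_labels, epochs=1)" | |
] | |
}, | |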
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"This is a very complicated deep model that has been defined in ~10 lines of code, which is very powerful. Consult **Michela's talk from Tuesday** for ways to build your ML data-pipeline to construct good training workflows." | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"## Where to go from here?" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Keras can be as simple or as complex as you like. I recommend looking at the [Keras Examples](https://github.com/fchollet/keras/tree/master/examples) for implementation inspiration and the [Keras-Resources](https://github.com/fchollet/keras-resources/blob/master/README.md) repository for existing projects and [other talks](https://github.com/fchollet/keras-resources/blob/master/README.md#tutorials) that do a great job at going in depth. " | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [] | |
} | |
], | |
"metadata": { | |
"kernelspec": { | |
"display_name": "Python 2", | |
"language": "python", | |
"name": "python2" | |
}, | |
"language_info": { | |
"codemirror_mode": { | |
"name": "ipython", | |
"version": 2 | |
}, | |
"file_extension": ".py", | |
"mimetype": "text/x-python", | |
"name": "python", | |
"nbconvert_exporter": "python", | |
"pygments_lexer": "ipython2", | |
"version": "2.7.13" | |
} | |
}, | |
"nbformat": 4, | |
"nbformat_minor": 2 | |
} |