{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Deep Learning with PyTorch: A 60-minute Blitz\n",
"\n",
"Goal of this tutorial:\n",
"\n",
"- Understand PyTorch's Tensor library and neural networks at a high level.\n",
"- Train a small neural network to classify images\n",
"\n",
"*This tutorial assumes that you have a basic familiarity of numpy*\n",
"\n",
"\n",
"**Note:** Make sure you have the [torch](https://github.com/pytorch/pytorch) and [torchvision](https://github.com/pytorch/vision) packages installed.\n",
"\n",
"\n",
"### What is PyTorch?\n",
"\n",
"It's a Python based scientific computing package targeted at two sets of audiences:\n",
"\n",
"- A replacement for numpy to use the power of GPUs\n",
"- a deep learning research platform that provides maximum flexibility and speed\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Getting Started\n",
"\n",
"#### Tensors\n",
"Tensors are similar to numpy's ndarrays, with the addition being that Tensors can also be used on a GPU to accelerate computing."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"from __future__ import print_function\n",
"import torch"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"x = torch.Tensor(5, 3) # construct a 5x3 matrix, uninitialized"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"x = torch.rand(5, 3) # construct a randomly initialized matrix"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 0.2557 0.2978 0.4759\n",
" 0.8370 0.3197 0.7278\n",
" 0.8187 0.4259 0.5563\n",
" 0.3813 0.5840 0.1177\n",
" 0.6367 0.4184 0.3419\n",
"[torch.FloatTensor of size 5x3]"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"torch.Size([5, 3])"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x.size()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"*NOTE: `torch.Size` is in fact a tuple, so it supports the same operations*"
]
},
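{
"cell_type": "markdown",
"metadata": {},
"source": [
"For instance, since it behaves like a tuple you can unpack or index it directly. A minimal sketch, assuming the `x` defined above:\n",
"\n",
"```python\n",
"rows, cols = x.size()  # tuple-style unpacking\n",
"print(rows, cols)      # 5 3\n",
"print(x.size()[0])     # indexing works too: 5\n",
"```"
]
},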
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"y = torch.rand(5, 3)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 0.9575 1.1370 0.8067\n",
" 1.2351 0.7192 1.0788\n",
" 1.2892 1.0807 0.7792\n",
" 0.8983 0.6259 0.3314\n",
" 0.9066 1.2995 0.7470\n",
"[torch.FloatTensor of size 5x3]"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# addition: syntax 1\n",
"x + y"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 0.9575 1.1370 0.8067\n",
" 1.2351 0.7192 1.0788\n",
" 1.2892 1.0807 0.7792\n",
" 0.8983 0.6259 0.3314\n",
" 0.9066 1.2995 0.7470\n",
"[torch.FloatTensor of size 5x3]"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# addition: syntax 2\n",
"torch.add(x, y)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 0.9575 1.1370 0.8067\n",
" 1.2351 0.7192 1.0788\n",
" 1.2892 1.0807 0.7792\n",
" 0.8983 0.6259 0.3314\n",
" 0.9066 1.2995 0.7470\n",
"[torch.FloatTensor of size 5x3]"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# addition: giving an output tensor\n",
"result = torch.Tensor(5, 3)\n",
"torch.add(x, y, out=result)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 0.9575 1.1370 0.8067\n",
" 1.2351 0.7192 1.0788\n",
" 1.2892 1.0807 0.7792\n",
" 0.8983 0.6259 0.3314\n",
" 0.9066 1.2995 0.7470\n",
"[torch.FloatTensor of size 5x3]"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# addition: in-place\n",
"y.add_(x) # adds x to y"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> **Note: ** Any operation that mutates a tensor in-place is post-fixed with an `_`\n",
"> \n",
"> For example: `x.copy_(y)`, `x.t_()`, will change `x`."
]
},
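{
"cell_type": "markdown",
"metadata": {},
"source": [
"To contrast in-place and out-of-place operations, a minimal sketch assuming the `x` and `y` from above:\n",
"\n",
"```python\n",
"z = y.add(x)  # out-of-place: y is left untouched, the sum is returned as z\n",
"y.add_(x)     # in-place: y itself is modified\n",
"x.t_()        # in-place transpose: x is now 3x5\n",
"x.t_()        # transpose back, so x is 5x3 again for the cells below\n",
"```"
]
},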
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 0.2978\n",
" 0.3197\n",
" 0.4259\n",
" 0.5840\n",
" 0.4184\n",
"[torch.FloatTensor of size 5]"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# standard numpy-like indexing with all bells and whistles\n",
"x[:,1]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Read later:\n",
"\n",
"100+ Tensor operations, including transposing, indexing, slicing, \n",
"mathematical operations, linear algebra, random numbers, etc.\n",
"\n",
"http://pytorch.org/docs/torch.html"
]
},
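{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a tiny sampler of those operations (a sketch, assuming the `x` and `y` defined above):\n",
"\n",
"```python\n",
"print(x.t())               # transpose, giving a 3x5 tensor\n",
"print(torch.mm(x.t(), y))  # matrix multiplication: (3x5) x (5x3) -> 3x3\n",
"print(x.sum(), x.max())    # reductions\n",
"print(torch.randperm(5))   # a random permutation of 0..4\n",
"```"
]
},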
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Numpy Bridge\n",
"\n",
"Converting a torch Tensor to a numpy array and vice versa is a breeze.\n",
"\n",
"The torch Tensor and numpy array will share their underlying memory locations, and changing one will change the other.\n",
"\n",
"#### Converting torch Tensor to numpy Array"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 1\n",
" 1\n",
" 1\n",
" 1\n",
" 1\n",
"[torch.FloatTensor of size 5]"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"a = torch.ones(5)\n",
"a"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"array([ 1., 1., 1., 1., 1.], dtype=float32)"
]
},
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"b = a.numpy()\n",
"b"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
" 2\n",
" 2\n",
" 2\n",
" 2\n",
" 2\n",
"[torch.FloatTensor of size 5]\n",
"\n",
"[ 2. 2. 2. 2. 2.]\n"
]
}
],
"source": [
"a.add_(1)\n",
"print(a)\n",
"print(b) # see how the numpy array changed in value"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Converting numpy Array to torch Tensor"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[ 2. 2. 2. 2. 2.]\n",
"\n",
" 2\n",
" 2\n",
" 2\n",
" 2\n",
" 2\n",
"[torch.DoubleTensor of size 5]\n",
"\n"
]
}
],
"source": [
"import numpy as np\n",
"a = np.ones(5)\n",
"b = torch.from_numpy(a)\n",
"np.add(a, 1, out=a)\n",
"print(a)\n",
"print(b) # see how changing the np array changed the torch Tensor automatically"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"All the Tensors on the CPU except a CharTensor support converting to NumPy and back.\n",
"\n",
"### CUDA Tensors\n",
"\n",
"Tensors can be moved onto GPU using the `.cuda` function."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"# let us run this cell only if CUDA is available\n",
"if torch.cuda.is_available():\n",
" x = x.cuda()\n",
" y = y.cuda()\n",
" x + y"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"**Next: Neural Networks with PyTorch.**\n",
"\n",
"Central to all neural networks in PyTorch is the `autograd` package.\n",
"Let's first briefly visit this, and we will then go to training our first neural network.\n",
"\n",
"## Autograd: automatic differentiation\n",
"\n",
"The `autograd` package provides automatic differentiation for all operations on Tensors. \n",
"It is a define-by-run framework, which means that your backprop is defined by how your code is run, and that every single iteration can be different. \n",
"\n",
"Let us see this in more simple terms with some examples.\n",
"\n",
"`autograd.Variable` is the central class of the package. \n",
"It wraps a Tensor, and supports nearly all of operations defined on it. Once you finish your computation you can call `.backward()` and have all the gradients computed automatically.\n",
"\n",
"You can access the raw tensor through the `.data` attribute, while the gradient w.r.t. this variable is accumulated into `.grad`.\n",
"\n",
"![Variable](images/Variable.png)\n",
"\n",
"There's one more class which is very important for autograd implementation - a `Function`. \n",
"\n",
"`Variable` and `Function` are interconnected and build up an acyclic graph, that encodes a complete history of computation. Each variable has a `.creator` attribute that references a `Function` that has created the `Variable` (except for Variables created by the user - their `creator is None`).\n",
"\n",
"If you want to compute the derivatives, you can call `.backward()` on a `Variable`. \n",
"If `Variable` is a scalar (i.e. it holds a one element data), you don't need to specify any arguments to `backward()`, however if it has more elements, you need to specify a `grad_output` argument that is a tensor of matching shape.\n"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"from torch.autograd import Variable"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 1 1\n",
" 1 1\n",
"[torch.FloatTensor of size 2x2]"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x = Variable(torch.ones(2, 2), requires_grad = True)\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 3 3\n",
" 3 3\n",
"[torch.FloatTensor of size 2x2]"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y = x + 2\n",
"y"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"<torch.autograd._functions.basic_ops.AddConstant at 0x10d865d58>"
]
},
"execution_count": 20,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y.creator\n",
"# y was created as a result of an operation, \n",
"# so it has a creator"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 27 27\n",
" 27 27\n",
"[torch.FloatTensor of size 2x2]"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"z = y * y * 3\n",
"z"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 27\n",
"[torch.FloatTensor of size 1]"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"out = z.mean()\n",
"out"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# let's backprop now\n",
"out.backward()\n",
"\n",
"# out.backward() is equivalent to doing out.backward(torch.Tensor([1.0]))"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 4.5000 4.5000\n",
" 4.5000 4.5000\n",
"[torch.FloatTensor of size 2x2]"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# print gradients d(out)/dx\n",
"x.grad"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You should have got a matrix of `4.5`.\n",
"Let's call the `out` *Variable* \"$o$\". \n",
"We have that $o = \\frac{1}{4}\\sum_i z_i$, $z_i = 3(x_i+2)^2$ and $z_i\\bigr\\rvert_{x_i=1} = 27$. Therefore, $\\frac{\\partial o}{\\partial x_i} = \\frac{3}{2}(x_i+2)$, hence $\\frac{\\partial o}{\\partial x_i}\\bigr\\rvert_{x_i=1} = \\frac{9}{2} = 4.5$."
]
},
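{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a rough numerical sanity check of this derivative (a sketch; the helper `f` below is only for illustration):\n",
"\n",
"```python\n",
"def f(t):\n",
"    # the same computation as above: o = mean(3 * (t + 2)^2)\n",
"    return (3 * (t + 2).pow(2)).mean()\n",
"\n",
"t = torch.ones(2, 2)\n",
"eps = 1e-3\n",
"t_plus = t.clone()\n",
"t_plus[0][0] += eps\n",
"print((f(t_plus) - f(t)) / eps)  # approximately 4.5\n",
"```"
]
},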
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> **You can do many crazy things with autograd:**"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"x = torch.randn(3)\n",
"x = Variable(x, requires_grad = True)"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"y = x * 2\n",
"while y.data.norm() < 1000:\n",
" y = y * 2"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" -22.1936\n",
" 1078.9509\n",
" -341.4025\n",
"[torch.FloatTensor of size 3]"
]
},
"execution_count": 27,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"gradients = torch.FloatTensor([0.1, 1.0, 0.0001])\n",
"y.backward(gradients)"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 51.2000\n",
" 512.0000\n",
" 0.0512\n",
"[torch.FloatTensor of size 3]"
]
},
"execution_count": 29,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x.grad"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Read Later:\n",
"> You can read more documentation on `Variable` and `Function` here: [pytorch.org/docs/autograd.html](http://pytorch.org/docs/autograd.html)\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Neural Networks\n",
"Neural networks can be constructed using the `torch.nn` package.\n",
"\n",
"Now that you had a glimpse of `autograd`, `nn` depends on `autograd` to define models and differentiate them.\n",
"\n",
"An `nn.Module` contains layers, and a method `forward(input)`that returns the `output`.\n",
"\n",
"For example, look at this network that classfies digit images:\n",
"\n",
"![convnet](images/mnist.png)\n",
"\n",
"It is a simple feed-forward network.\n",
"It takes the input, feeds it through several layers one after the other, and then finally gives the output.\n",
"\n",
"A typical training procedure for a neural network is as follows:\n",
"- define the neural network that has some learnable parameters (or weights)\n",
"- iterate over a dataset of inputs:\n",
" - process input through network\n",
" - compute the loss (how far is the output from being correct)\n",
" - propagate gradients back into the network's parameters\n",
" - update the weights of the network\n",
" - typically using a simple update rule: `weight = weight + learning_rate * gradient`\n",
" \n",
"\n",
"Let's define this network:"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Net (\n",
" (conv1): Conv2d(1, 6, kernel_size=(5, 5), stride=(1, 1))\n",
" (conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))\n",
" (fc1): Linear (400 -> 120)\n",
" (fc2): Linear (120 -> 84)\n",
" (fc3): Linear (84 -> 10)\n",
")"
]
},
"execution_count": 30,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import torch.nn as nn\n",
"import torch.nn.functional as F\n",
"\n",
"class Net(nn.Module):\n",
" def __init__(self):\n",
" super(Net, self).__init__()\n",
" self.conv1 = nn.Conv2d(1, 6, 5) # 1 input image channel, 6 output channels, 5x5 square convolution kernel\n",
" self.conv2 = nn.Conv2d(6, 16, 5)\n",
" self.fc1 = nn.Linear(16*5*5, 120) # an affine operation: y = Wx + b\n",
" self.fc2 = nn.Linear(120, 84)\n",
" self.fc3 = nn.Linear(84, 10)\n",
"\n",
" def forward(self, x):\n",
" x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2)) # Max pooling over a (2, 2) window\n",
" x = F.max_pool2d(F.relu(self.conv2(x)), 2) # If the size is a square you can only specify a single number\n",
" x = x.view(-1, self.num_flat_features(x))\n",
" x = F.relu(self.fc1(x))\n",
" x = F.relu(self.fc2(x))\n",
" x = self.fc3(x)\n",
" return x\n",
" \n",
" def num_flat_features(self, x):\n",
" size = x.size()[1:] # all dimensions except the batch dimension\n",
" num_features = 1\n",
" for s in size:\n",
" num_features *= s\n",
" return num_features\n",
"\n",
"net = Net()\n",
"net"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You just have to define the `forward` function, and the `backward` function (where gradients are computed) is automatically defined for you using `autograd`.\n",
"\n",
"You can use any of the Tensor operations in the `forward` function.\n",
"\n",
"The learnable parameters of a model are returned by `net.parameters()`"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"10\n",
"torch.Size([6, 1, 5, 5])\n"
]
}
],
"source": [
"params = list(net.parameters())\n",
"print(len(params))\n",
"print(params[0].size()) # conv1's .weight"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The input to the forward is an `autograd.Variable`, and so is the output."
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
"-0.0379 -0.0750 0.0371 0.0750 -0.0246 -0.0014 0.1110 0.0519 0.0229 0.0302\n",
"[torch.FloatTensor of size 1x10]"
]
},
"execution_count": 32,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"input = Variable(torch.randn(1, 1, 32, 32))\n",
"out = net(input)\n",
"out"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"net.zero_grad() # zeroes the gradient buffers of all parameters\n",
"out.backward(torch.randn(1, 10)) # backprops with random gradients"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> #### NOTE: `torch.nn` only supports mini-batches\n",
"The entire `torch.nn` package only supports inputs that are a mini-batch of samples, and not a single sample. \n",
"For example, `nn.Conv2d` will take in a 4D Tensor of `nSamples x nChannels x Height x Width`.\n",
"\n",
"> *If you have a single sample, just use `input.unsqueeze(0)` to add a fake batch dimension.*"
]
},
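{
"cell_type": "markdown",
"metadata": {},
"source": [
"For instance, a single 1x32x32 image can be turned into a 1x1x32x32 mini-batch like this (a sketch, using a made-up `single_image` tensor):\n",
"\n",
"```python\n",
"single_image = torch.randn(1, 32, 32)  # C x H x W, no batch dimension\n",
"batch = single_image.unsqueeze(0)      # now 1 x C x H x W\n",
"print(batch.size())                    # torch.Size([1, 1, 32, 32])\n",
"```"
]
},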
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Recap of all the classes you've seen so far:\n",
"\n",
"* `torch.Tensor` - A **multi-dimensional array**.\n",
"* `autograd.Variable` - **Wraps a Tensor and records the history of operations** applied to it. Has the same API as a `Tensor`, with some additions like `backward()`. Also **holds the gradient** w.r.t. the tensor.\n",
"* `nn.Module` - Neural network module. **Convenient way of encapsulating parameters**, with helpers for moving them to GPU, exporting, loading, etc.\n",
"* `nn.Parameter` - A kind of Variable, that is **automatically registered as a parameter when assigned as an attribute to a `Module`**.\n",
"* `autograd.Function` - Implements **forward and backward definitions of an autograd operation**. Every `Variable` operation, creates at least a single `Function` node, that connects to functions that created a `Variable` and **encodes its history**.\n",
"\n",
"##### At this point, we covered:\n",
"- Defining a neural network\n",
"- Processing inputs and calling backward.\n",
"\n",
"##### Still Left:\n",
"- Computing the loss\n",
"- Updating the weights of the network\n"
]
},
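{
"cell_type": "markdown",
"metadata": {},
"source": [
"A small sketch of how these pieces fit together, assuming the `net` defined earlier:\n",
"\n",
"```python\n",
"# conv1.weight was created by assigning an nn.Parameter inside __init__,\n",
"# so it is registered and shows up in net.parameters() automatically\n",
"print(isinstance(net.conv1.weight, nn.Parameter))  # True\n",
"print(type(net.conv1.weight.data))                 # the raw FloatTensor wrapped by the Variable\n",
"print(len(list(net.parameters())))                 # 10: a weight and a bias for each of the 5 layers\n",
"```"
]
},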
{
"cell_type": "markdown",
"metadata": {
"collapsed": true
},
"source": [
"A loss function takes the (output, target) pair of inputs, and computes a value that estimates how far away the output is from the target.\n",
"\n",
"There are [several different loss functions under the nn package](http://pytorch.org/docs/nn.html#loss-functions).\n",
"\n",
"A simple loss is: `nn.MSELoss` which computes the mean-squared error between the input and the target.\n",
"\n",
"For example:"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 38.1446\n",
"[torch.FloatTensor of size 1]"
]
},
"execution_count": 34,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"output = net(input)\n",
"target = Variable(torch.range(1, 10)) # a dummy target, for example\n",
"criterion = nn.MSELoss()\n",
"loss = criterion(output, target)\n",
"loss"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, if you follow `loss` in the backward direction, using it's `.creator` attribute, you will see a graph of computations that looks like this:\n",
"\n",
"```\n",
"input -> conv2d -> relu -> maxpool2d -> conv2d -> relu -> maxpool2d \n",
" -> view -> linear -> relu -> linear -> relu -> linear \n",
" -> MSELoss\n",
" -> loss\n",
"```\n",
"\n",
"So, when we call `loss.backward()`, the whole graph is differentiated w.r.t. the loss, and all Variables in the graph will have their `.grad` Variable accumulated with the gradient.\n",
" "
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<torch.nn._functions.thnn.auto.MSELoss object at 0x10d8c7470>\n",
"<torch.nn._functions.linear.Linear object at 0x10d8c73d8>\n",
"<torch.nn._functions.thnn.auto.Threshold object at 0x10d8c7340>\n"
]
}
],
"source": [
"# For illustration, let us follow a few steps backward\n",
"print(loss.creator) # MSELoss\n",
"print(loss.creator.previous_functions[0][0]) # Linear\n",
"print(loss.creator.previous_functions[0][0].previous_functions[0][0]) # ReLU"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"conv1.bias.grad before backward\n",
"Variable containing:\n",
" 0\n",
" 0\n",
" 0\n",
" 0\n",
" 0\n",
" 0\n",
"[torch.FloatTensor of size 6]\n",
"\n",
"conv1.bias.grad after backward\n",
"Variable containing:\n",
"-0.0235\n",
"-0.0954\n",
"-0.0245\n",
" 0.0572\n",
"-0.1005\n",
" 0.1112\n",
"[torch.FloatTensor of size 6]\n",
"\n"
]
}
],
"source": [
"# now we shall call loss.backward(), and have a look at conv1's bias gradients before and after the backward.\n",
"net.zero_grad() # zeroes the gradient buffers of all parameters\n",
"print('conv1.bias.grad before backward')\n",
"print(net.conv1.bias.grad)\n",
"loss.backward()\n",
"print('conv1.bias.grad after backward')\n",
"print(net.conv1.bias.grad)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, we have seen how to use loss functions.\n",
"\n",
"##### Read Later:\n",
"\n",
"> The neural network package contains various modules and loss functions that form the building blocks of deep neural networks. A full list with documentation is here: http://pytorch.org/docs/nn.html\n",
"\n",
"\n",
"**The only thing left to learn is:**\n",
"- updating the weights of the network\n",
"\n",
"The simplest update rule used in practice is the Stochastic Gradient Descent (SGD):\n",
"> `weight = weight - learning_rate * gradient`\n",
"\n",
"We can implement this using simple python code:\n",
"\n",
"```python\n",
"learning_rate = 0.01\n",
"for f in net.parameters():\n",
" f.data.sub_(f.grad.data * learning_rate)\n",
"```\n",
"\n",
"However, as you use neural networks, you want to use various different update rules such as SGD, Nesterov-SGD, Adam, RMSProp, etc.\n",
"\n",
"To enable this, we built a small package: `torch.optim` that implements all these methods.\n",
"Using it is very simple:"
]
},
{
"cell_type": "code",
"execution_count": 73,
"metadata": {
"collapsed": false
},
"outputs": [
{
"ename": "RuntimeError",
"evalue": "Need input of dimension 4 and input.size[1] == 3 but got input to be of shape: [1 x 1 x 32 x 32] at /Users/soumith/anaconda/conda-bld/pytorch-0.1.7_1485439972367/work/torch/lib/THNN/generic/SpatialConvolutionMM.c:47",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mRuntimeError\u001b[0m Traceback (most recent call last)",
"\u001b[0;32m<ipython-input-73-507b3ce34449>\u001b[0m in \u001b[0;36m<module>\u001b[0;34m()\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0;31m# in your training loop:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 6\u001b[0m \u001b[0moptimizer\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mzero_grad\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;31m# zero the gradient buffers\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 7\u001b[0;31m \u001b[0moutput\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnet\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 8\u001b[0m \u001b[0mloss\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcriterion\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0moutput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtarget\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 9\u001b[0m \u001b[0mloss\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mbackward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/module.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *input, **kwargs)\u001b[0m\n\u001b[1;32m 208\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 209\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__call__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 210\u001b[0;31m \u001b[0mresult\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mforward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 211\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mhook\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_forward_hooks\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvalues\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 212\u001b[0m \u001b[0mhook_result\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mhook\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mresult\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m<ipython-input-61-a4505decff7e>\u001b[0m in \u001b[0;36mforward\u001b[0;34m(self, x)\u001b[0m\n\u001b[1;32m 10\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 11\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mforward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 12\u001b[0;31m \u001b[0mx\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpool\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mF\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrelu\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconv1\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 13\u001b[0m \u001b[0mx\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpool\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mF\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrelu\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconv2\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 14\u001b[0m \u001b[0mx\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mview\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m16\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0;36m5\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0;36m5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/module.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *input, **kwargs)\u001b[0m\n\u001b[1;32m 208\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 209\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__call__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 210\u001b[0;31m \u001b[0mresult\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mforward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 211\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mhook\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_forward_hooks\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvalues\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 212\u001b[0m \u001b[0mhook_result\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mhook\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mresult\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/conv.py\u001b[0m in \u001b[0;36mforward\u001b[0;34m(self, input)\u001b[0m\n\u001b[1;32m 233\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mforward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 234\u001b[0m return F.conv2d(input, self.weight, self.bias, self.stride,\n\u001b[0;32m--> 235\u001b[0;31m self.padding, self.dilation, self.groups)\n\u001b[0m\u001b[1;32m 236\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 237\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/functional.py\u001b[0m in \u001b[0;36mconv2d\u001b[0;34m(input, weight, bias, stride, padding, dilation, groups)\u001b[0m\n\u001b[1;32m 35\u001b[0m f = ConvNd(_pair(stride), _pair(padding), _pair(dilation), False,\n\u001b[1;32m 36\u001b[0m _pair(0), groups)\n\u001b[0;32m---> 37\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbias\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mbias\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mNone\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 38\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 39\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/_functions/conv.py\u001b[0m in \u001b[0;36mforward\u001b[0;34m(self, input, weight, bias)\u001b[0m\n\u001b[1;32m 31\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mk\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m3\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 32\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0m_view4d\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 33\u001b[0;31m \u001b[0moutput\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_update_output\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbias\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 34\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mk\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m3\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 35\u001b[0m \u001b[0moutput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0m_view3d\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0moutput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/_functions/conv.py\u001b[0m in \u001b[0;36m_update_output\u001b[0;34m(self, input, weight, bias)\u001b[0m\n\u001b[1;32m 86\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 87\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_bufs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mg\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgroups\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 88\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_thnn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'update_output'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbias\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 89\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 90\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_grad_input\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgrad_output\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/_functions/conv.py\u001b[0m in \u001b[0;36m_thnn\u001b[0;34m(self, fn_name, input, weight, *args)\u001b[0m\n\u001b[1;32m 145\u001b[0m \u001b[0mimpl\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0m_thnn_convs\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mthnn_class_name\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 146\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgroups\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 147\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mimpl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mfn_name\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_bufs\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mweight\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 148\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 149\u001b[0m \u001b[0mres\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/_functions/conv.py\u001b[0m in \u001b[0;36mcall_update_output\u001b[0;34m(self, bufs, input, weight, bias)\u001b[0m\n\u001b[1;32m 223\u001b[0m \u001b[0margs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mparse_arguments\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marguments\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m5\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbufs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkernel_size\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 224\u001b[0m getattr(backend, fn.name)(backend.library_state, input, output, weight,\n\u001b[0;32m--> 225\u001b[0;31m bias, *args)\n\u001b[0m\u001b[1;32m 226\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0moutput\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 227\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mcall_update_output\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;31mRuntimeError\u001b[0m: Need input of dimension 4 and input.size[1] == 3 but got input to be of shape: [1 x 1 x 32 x 32] at /Users/soumith/anaconda/conda-bld/pytorch-0.1.7_1485439972367/work/torch/lib/THNN/generic/SpatialConvolutionMM.c:47"
]
}
],
"source": [
"import torch.optim as optim\n",
"# create your optimizer\n",
"optimizer = optim.SGD(net.parameters(), lr = 0.01)\n",
"\n",
"# in your training loop:\n",
"optimizer.zero_grad() # zero the gradient buffers\n",
"output = net(input)\n",
"loss = criterion(output, target)\n",
"loss.backward()\n",
"optimizer.step() # Does the update"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This is it.\n",
"\n",
"Now you might be thinking,\n",
"\n",
"### What about data?\n",
"Generally, when you have to deal with image, text, audio or video data, you can use standard python packages that load data into a numpy array. Then you can convert this array into a `torch.*Tensor`.\n",
"\n",
"- For images, packages such as Pillow, OpenCV are useful. \n",
"- For audio, packages such as scipy and librosa \n",
"- For text, either raw Python or Cython based loading, or NLTK and SpaCy are useful.\n",
"\n",
"Specifically for `vision`, we have created a package called `torchvision`, that \n",
"has data loaders for common datasets such as Imagenet, CIFAR10, MNIST, etc. and data transformers for images.\n",
"This provides a huge convenience and avoids writing boilerplate code.\n",
"\n",
"For this tutorial, we will use the CIFAR10 dataset. \n",
"It has the classes: 'airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck'.\n",
"The images in CIFAR-10 are of size 3x32x32, i.e. 3-channel color images of 32x32 pixels in size.\n",
"\n",
"![cifar10](images/cifar10.png)\n",
"\n",
"## Training an image classifier\n",
"\n",
"We will do the following steps in order:\n",
"\n",
"1. Load and normalizing the CIFAR10 training and test datasets using `torchvision`\n",
"1. Define a Convolution Neural Network\n",
"1. Define a loss function\n",
"1. Train the network on the training data\n",
"1. Test the network on the test data\n",
"\n",
"### 1. Loading and normalizing CIFAR10\n",
"\n",
"Using `torchvision`, it's extremely easy to load CIFAR10. "
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import torchvision\n",
"import torchvision.transforms as transforms"
]
},
{
"cell_type": "code",
"execution_count": 39,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading http://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz to ./data/cifar-10-python.tar.gz\n",
"Extracting tar file\n",
"Done!\n",
"Files already downloaded and verified\n"
]
}
],
"source": [
"\n",
"# The output of torchvision datasets are PILImage images of range [0, 1].\n",
"# We transform them to Tensors of normalized range [-1, 1]\n",
"transform=transforms.Compose([transforms.ToTensor(),\n",
" transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),\n",
" ])\n",
"trainset = torchvision.datasets.CIFAR10(root='./data', train=True, download=True, transform=transform)\n",
"trainloader = torch.utils.data.DataLoader(trainset, batch_size=4, \n",
" shuffle=True, num_workers=2)\n",
"\n",
"testset = torchvision.datasets.CIFAR10(root='./data', train=False, download=True, transform=transform)\n",
"testloader = torch.utils.data.DataLoader(testset, batch_size=4, \n",
" shuffle=False, num_workers=2)\n",
"classes = ('plane', 'car', 'bird', 'cat',\n",
" 'deer', 'dog', 'frog', 'horse', 'ship', 'truck')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Let us show some of the training images, for fun.*"
]
},
{
"cell_type": "code",
"execution_count": 41,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"# functions to show an image\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"%matplotlib inline\n",
"def imshow(img):\n",
" img = img / 2 + 0.5 # unnormalize\n",
" npimg = img.numpy()\n",
" plt.imshow(np.transpose(npimg, (1,2,0)))"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" cat car plane plane\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAB2CAYAAADY3GjsAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztfWmQXcd13tf37fPe7DtmBhjs4AICpECKWiNrsSjZWpyy\nZalsR0lUpajilO1ESSTFP+SkUrHLcdlxxY5TjLXZlijJsmxRiyXRlEiKpEgBJEWAJABiG8wMgNn3\n9+btnR/n9D1nMA+DGQLEYEb9VZF403fr7tv33nPOdxZjrYWHh4eHx8ZHsN4d8PDw8PC4PvAvdA8P\nD49NAv9C9/Dw8Ngk8C90Dw8Pj00C/0L38PDw2CTwL3QPDw+PTQL/Qvfw8PDYJLimF7ox5j5jzElj\nzGljzCevV6c8PDw8PNYO80oDi4wxEQAvA3gHgGEAhwF8yFr70vXrnoeHh4fHahG9hmPvAXDaWnsW\nAIwxXwbwPgBXfKG3tbXZ/v7+a7ikh4eHx88ennnmmQlrbfvV9ruWF3oPgCH19zCA116+kzHmowA+\nCgBbt27FkSNHruGSHh4eHj97MMacX81+rzopaq2931p7yFp7qL39qh8YDw8PD49XiGt5oV8A0Kf+\n7uU2Dw8PD491wLW80A8D2G2M2W6MiQP4IIAHr0+3PDw8PDzWildsQ7fWlo0x/w7A9wBEAHzWWvvi\nWs/zJ//xN+h8VWmLROg7Y4xRbRF33WXniBr5LoXHqGODgLZb16bO4fay+lqGr2WkU8bS9kgg16pW\nq0v+1ZfV/dTjuHx/t59uC/tbY6y65WP/4/8t2/5nf/4XAIBvfvs7YdvddxO1EY0W5BqpDADg3Ani\nsB944IFwWzlSAgBs2Somskw99Wlbd3PYFinR9mxlKmzr6KJxJMqxsC2V5r6npe3i4CQAoMjzGo3V\nyf48xfliTgZWoP62putlv4Z5AMDEnMzKxMwsAGBmWuYzkcoDAOanZfy5HI1xfpGuPzW4iMvxZ59b\nPr/rBbcW9Fqq1XY9rnE9Umr/1kf+zbK2rzw1t+xalWpFdwAAEPDzHOhxGd4WUU38NBjIfjH+qZ9T\nd4xqQiQa4dMm1OX5vaNfRpb6F+P9I+ocFb6+Vdev8HouVWQOi2UeY1UOdu+4mD4heD/13gHWdm+v\nhRSFtfY7AL5z1R09PDw8PF51XNML/Xog/ApHlku0WvJwEmwQLJfGA/XZrvk9CyWO5RJNMsFfaHWO\nivuiKkGlUi7XuIKTGpQ0YJyUo46tuC/u8nG531pCD8+lxmovG8OVUEtrcEgmUuHvYpHGnUw6yVj1\nCXEAQDlXDNt27iTJ+F1vFql9R0cLAKAp6Arb0g18fSVlBHHeFpPlVq7cCgCoGLrG9LxIyIPjJFGP\nTsv1XzxBEv3kxUm5Vpr6Xp+ROalUaYyVshxrYnTu5rZk2LawQJ2anVMS4k2MUKKtSH+j0XV/fNeE\nakk0pFpagJOq3UrUkrfl9ayPMhEnyetG0rysanMr0aqjnWIQKA0hME4KV9p6wBp0mfpesbK/5Wcd\nykJg2YodVc9uxKkNNq7Gw+etyJxUq+4do0a5RuXLh/57eHh4bBL4F7qHh4fHJsG662xVVnlqmVI0\nVmqztgaJoHZ31pRIjMwM+YKo4yMXLgEAFvOi+jQ1NgIA4oqAaWggk0OxXJK+83WjUSH7nJqoiRKn\nkbltS0lUJlsjkWVtWi1dLWEVkqw19mupbwl/z0yRWaOpvo3GEFODLdHvg7uFAP3DT78VANCzRc4x\nMUbXSDXKnDQ00mDTVhGaluZzdHYhbBoaIYJsbJxJyWm5/9EyLcvujBBW5hYylxyeHQzbykUmryNi\nrkkzA5uTQ1EN6I/hS+NhW30jqb+dPev+CKwKxWJxWdtGM7lAr/uQAFVmFTZruudKm1IrFTalQM7h\nnplqRJGSAa9F9fw7E0oQ1WZFun5Unc95NFQ0FxtxJhTU6BOW9akcjkE/u24/eU5s+J5QZiArxiHV\nA6wFXkL38PDw2CS4CT7xjihcTgAuIQ9rfKlqSa3Wuq+x+lbxV/Wl42cAAMdPnAs3TcyQpFgsydez\ntZkkym29nWFbJk1uc5k6ca9Lp0nya+8QqdW5Pzq3KACIcFtBaQGXQ2soTmoRMrW21L7ieZbsxy5S\nRiTkoERzUCqml523v5fG9fu/986wraW+FQDwHz7xZNj26NMnAQBxRbbecguRph//t3eHbd/6Lnmz\n/ujxl8O24ZFpAMDMHGkKlbImgmjcSaUiNWVIQt+3syNsiydZkjOyjLN8PluV9bKYp/EXlNhumDwz\nZuX5fOVYTp7Xalsqiy2V2ipKGzz+Es3hzl27wrZkMol1RQ034BV3r5TD305LLZakrVJx0q3TWuW+\nhhqKul/RqHNvlmvEmHjXbc5FcInLI2+PB0JyxpkANUpbdu8Ty89ztSrbAiY59bsrCO+olrzpd9mK\nllWq1nJJZuJ3yabIsv1WgpfQPTw8PDYJ/Avdw8PDY5Ng3U0uzoe7lilhiR+282u2y00vVaXKBs5v\nVBEwP3z0WQDAS8fPAgDiiXS4bXqOyDutAQ2PDAAAMimJSpyYIHNFMa8IwCY6z5bu+bCtu4tIxrMD\nQt61tBC5uLOP/LWjys/VESGaRKlwm9XmgFVaBowhFTIe177Z9DtdlojO4ye+BwCYbyezSsKIj+zH\nP/ZmAMBtihT95H8nlf+xZ2VeK22cyichauGTJygpXOffy9x99ctPAABmJ0W9dqapIEr9rW8Rc0hD\nC81ruk76ND9F+73w8mjY9nNdewEAuZxEIIL56XhMzFsTo1kAQKmg1PUYra256eX++qtFFS6ieHn0\npkYYV6Haat1OiWSmsZYLYiKbuEhrd1tflzqiwV1hTf2+Ei6P/9B9dGagqh4rP2PaKBCYK8uIxVxW\nfrOJs1a8hHueq8r04cwrVeU3bhwZqR7eSrEGsehMM9oMy8NwcRAAUIk4olRei8bZacLYFHWOGjfR\n8L3T0abuCP2OCSrODCONAb8XKkrONnZtcRJeQvfw8PDYJFh3Cb3Kvj+mxpd9aX6TMOtK2GasIztU\nZBe73/3kWamz8fTzp/l8dGxjTL56MUe8qS/l6AxJkueGJN17RzNJnLNMugHAdJYkqIELE2FbMkEi\n4stKQm9uaQIAvOUeio68e/++cJuL/KvqcdXI5RLUcG+sBedCeWC/kJIlQ9LqdKUxbHv2LF0jlSOJ\n973vviXc9ksf2AIAeOzwWNj27ceIRI3WN8m1GkmSbutulQ6kSeLJFcSVM54kSdtGdVScI7SpH3lF\njjXwUiha2d8kWXqsKgK0QGRsdlGiR7N5lm6VK2PV+aGpNZbPOeL5ykT11bGcAAujHJfkEqpFaC/P\nQ+K0NRedmM+LRGtZMi0rolRI1FcHwZJIbb5KWZ6dkGwMVqchlIvy7LjnPqqIz1iM1kzZjbWqxsrr\nJK5cDyNO8lbzGguWS+gRlryNIkDDAHX
lchg4bVkT9G5sLgJbzzavTx1ZGuc+JVVUdKlI+5ULaj5L\nfF3F1EaiopGGlyivbX16Cd3Dw8Njk8C/0D08PDw2Cdbd5OKIz6VpQWv5nC9PX+t+JePSNniR/Ju/\n+ehzYVuc1bo4q4jnLgix1pAmX972hoycl/syMS/Rjq0N7Gut1MBMnHzSFypy/eMDFHlaUmpbdoHU\n/0eePErnahGycfsW8qsOlHmpGpJTmoBZXarUaITU1p4tYhp58dwAAOAbz/xT2DbDRGJDmcb4sY/d\nGW6LR+jYzz1wLGybLdJYo4rsKeTJTFKpinljfpauP5pS5qIIEZ7lymzYFuGxRdmElEqJL7tTeTWJ\n5dTleeXLf+LMCABg/21CFFYnyPyyWJZ7ZznKsL5eVO7Z6RL3SUxDa4Uj5cwSDX15BKBT17X05Mxq\nFeWbHeFj4pxGtVISE4VLNraQlbnuXG6FvK6wZbnXcb5fzU2yri5eorVu65R//womwVhkeWyGNiGG\npim3/nUEaJi4ankKWrUbYmxejOqEeaHJa7nJxSiSM7DufOpdxKahiku6VxUSNZ2mvvf2qOR0qSSf\nQ0WPsjlxdl5et8Mj5KCwUJD1XMjzmo3ImkxEPCnq4eHh8TOJdZfQJQWuzuUSWbJtyf6KxIhy6tvh\niZmw7fGnnwcAzGflS9raQF+8BBOW1YiQD3OLJKk1p1XEGJMzuaJI487VMKuklmaWmvvaxEVvZpo0\nhJFZuX6Biay6GGkBTxwWwrb5rXRsY0ai/gLntqVdnwLntrWym125ROMYuzQtfRqiL39GaQ1dnXS9\nt76HXA/roiJ5PfoDcj08fFg0mVQduWMWikoGYLLn4ktCAMfrSNKebZDxt7eQe93oqNwndxYXURuP\nyz1xcx01Ir2VeS6amxvCtok5Tqk7KW6o4HwZOsrUBVTGY9L3dB3di7NnRQpeK4IaKY2LnCdoSeSv\nk8ZrFDZJJES63bVjOwDgll10Tx5/RDSq556jeV2ao+f6iuiXnyWqCNsod709I+skFyeNaz5QboMr\nKJAxlcukGmGSMZB1EkZZ83xFlUNkxTlAqCIRNs/HKqWgGnPPqSJUuX9Jlfsm4QrmGHnuq9w/nZup\nytpSEBLQ6r3CTgHtjdKnSomIbH2bAiZIGxuko/MsrS/mRZN0Z6koK0Bpjff2qhK6MeazxpgxY8wL\nqq3FGPOQMeYU/9u80jk8PDw8PF59rMbk8nkA913W9kkAD1trdwN4mP/28PDw8FhHXNXkYq19zBjT\nf1nz+wC8hX9/AcAjAD7xSjrg1NGl/tXOD1s1sWqkfc7HOR3rlx78Ydi2mCUzgE59m88TkdTXQYrE\nXF5U39lZUhsLeSGbnA/pYlEIi3KBVPNcUUisEhM13e2ihs7NkKljLitRfotMijiCZWxK/IsfeZKi\nWN/7zjeGbZbPW6suadWubHJx+bxKyq/7jfeSur61+9awbUvbHQCAnj5SGyfmRJWczZMpI56UCMx4\nQOcoqci6rR1ky6gUZRkNjZKZZm5S9uuqJ/NGRBFV5jKSV6eHrWMbia7Ok+KaoyVFSqW4UtH0nKio\n3W00nksjcu+yWZ67Ol1ZhqMCY9dirmBSVJkBd+3YBgDYvnV72ObMSnMLElE8Pk4+/nOzMsflIq3B\nxXlaQ2OXLoTbXMWs+Xk5R5gpugZ5Xiv1ss5YZWqYa8QXYfm25gYydaWUKSnmzBDaDLiCT7qBIoDD\ndLTq+i45FZ+uLLuHsQQqXx1CC4oyzRUKvGaUv7qrDlRQJlRHbkYiivhN0AmrJVXtyqW+rbr1Kueo\nYwK0qjoavp6Uf72LjTh3/nzYNjNH67Oq4iqcM4D2dbdrjAJ+paRop7X2Ev8eAdB5pR2NMR81xhwx\nxhwZHx+/0m4eHh4eHteIayZFrbXWrJCD1Fp7P4D7AeDQoUMr7af/WtbmckQUCvJF/f7DTwMAzgwK\neefcwRpSKjcIE3UZ/veWPiEgL/IMNKggrQpHg0Zjco46Jq/qovL1bmijVLGvedPPhW07bzsAAJj9\n8t+GbaeGKZI0CkeOyRf9whRJaAs5kSgbOIeJTgFcdtLqVdwWHQHz7rdL26GDXIgiJml+Q/cyzn3R\n3Slug/v2vB4AkKkXV85PffoxAMD4qJA497yGolFv7XtL2Pb33/0sAGB6ZiRs6+/qBQDEY0pqCqif\npSrX9pwWcnI+x9KQ1kYqtL2oa2rGuMBIn3IvzVA0bDkvxFYxR79LBUWA8qnr0zLuy6GJzVpwOTea\n62XxvP4QaT5YkLWbZ8mwa1tP2LZ/zw4AwOKiaIZHDj8FAPjS33wOAPDMcz8Nt2VZM0wvCNntxhME\nKvdImL5ZtbHYWFbirUyj7GfZNbTK6YgTMZmbrkZ6ZhYmjodt1RiNO6nmqWCv7LZYVuveEa6BTnDi\nXAOdy1+gt9F5AyWNGxcValRaXpauA8WUhvlf1PMU477HldboCmxo04CTzF0ilmhC7c+FUxYKsibn\nWeOHSuk8Nkla1ZSqkVuxMb7Uco3bQLkq1kizuxJeqYQ+aozpBgD+d+wq+3t4eHh4vMp4pS/0BwF8\nmH9/GMA3rk93PDw8PDxeKa5qcjHGPAAiQNuMMcMAPg3gDwB81RjzEQDnAXzglXagVlKusIqRIkBd\nROEzzx4N284MXAQgCXEAoMRq5aJSg7b3EKHTy0mkUmkhMVs4UnRqUsw2jsTs724P25qbUnxeUZvu\nuYdMDne85rVhmyNPR5Rv/Km/+ipt475lVXWkLZ1kBjk7LNc/sI+ItYpOncnH1p4vQZwTj+3fKepy\nokoEbVml/nXclUsPalSSqoATW/3CW3eEbY6f+5M/+1HYZrOshmelAlQqQeO5tCAJsxIJmvdiSfz1\nM0xQNtfRdVvqxb+8vYPU4Y4WMfm0NCR4fLImyqyOxlOiXieSpMo+UxJV9iLrj9E6mZP8IptrlB/w\nmmFdbIA0HT9B3r1JpSq72IDktFS72rNnDwCgLi7jedPraT01ZGg85y9cDLdVxmk+XcQoAOy/lRKq\nzc8LAe9I/uySVLW0npdW1qF5iul6uLzG3EuhOC1r8tQgUWb779wbtl1yFYbUXNvgyiaCOhVRGjKe\nOnrTcbecTEufyzk25FWk8CInL+vslOfUHZtXqYfjHH+S0GPl2qMF1XcXZa1J3irHnaTY5JpUa+jS\nOJm/FubkWc/OLfB+sp6rxplm1Hn5Gjp6Now81iaXNfqhr8bL5UNX2PS2NV3Jw8PDw+NVxbpHikoV\nbV2Dj/5VfAXGmaj8yUsDYdsiS0jRmJBSARMkZeW2t8guhP19RM4194hL2SUmIMfHxUWsKUNf4R3b\nxXnn3CCRfMmMpKDdsWMnACCtCNigjqTWX3jPe8K2qXmSJJ589BEAQGFepAx3A44dezFs62mjr3t7\ni1zLMMkVucoHO5PiSMmEFLOwAUsmmnhm0qbCvmpRnVqUax8a5Y71sX+1FQDwhjf987Bt4BxJTWeH
\nRUJJ/ZTmMxMTCbGvh67/nndL/Nnb30xzd2g/Se8dLSK9prg4R2FR5WPJcV9KegI4orckfZ/K0jFb\nm+WeZKJ0zPEz4mU1Mbc8pemawfM5MyOuhI+epSjg+qSKKK4nzaSlSSTJUpHmp7FR7nFXF+UE6eig\n/dKqfm005txrhdjd3kv7t7RI+mJHsk7PCHk6MUWk/PiEaE1zXNiloEhZBHTvqkxe/uTZR8JNt+/s\nBgDEOuSZmLs4wJ1TrxGznOQLT68iIK2layjvwtDlz+XtsWqxx9N0r48deyZse/a5wwCAD/zqr4Rt\n3Z3d3DmV5pq172xMpOEUuxwWVd6ougw9dzptcJK1ihzPycTYpXBbjN0cI+rdFbA0XqqKNjA6Sc/H\n4JCk2e7uomjguIpUTURdfhvlr3m9I0U9PDw8PDYG/Avdw8PDY5Ng3U0utkaCI0ltKarP80dPAACa\nVJrbOibAzl0UVabMarhOX9mYoWOaWTW98zVSzSdgEjMWeSpsq28mNdiqCiKnh0hd7+zeErYNj5Aq\nd4+KCnMVi1oaJWHUzl5SU6d3EsloBoXsyudI9W2qF/V6cpLU6+52UaWLjii5igYWVOl8gTKXsHar\ng9dg+I/AperU6T7ZSblYkgOiVTJR3LFLxnXbXjr2h4/ItZo/RKaUPXskHW9fL5laWlKvCdvihtTQ\niosQrYg6HkuR6ptLyrWyETJRLKrIylKOo4eVWNKWpnv25v2yTu7cQ0Te+UExtT3xU7p3335CTFNr\nRZi+NSpkLxrJ1/zCkJjQXj49AACIBaKG19Vxtae2Njm0kfp8cYgiCoeGpWJWvkDzo6yLuDRM542o\n6NkkR9l2t4opp28LX0ORoi718cioPDsXxsj88vQRMmtM52WuZ6M0roFZMYNV4xwpabWJ4Moml3xW\nzEAJToAXUQnTImEqba5ips7l6uzqqOyZGXomT56UNM/NbMKqV89TnInniSkxuY1N0TNmVX3hBSZo\nmxpk7kolmhNnhikrB4xFTmU8PyVr6NxpuneDQ0Io19XTc7xz121hW3aB3juLFVVtiS1CqaQyA6Wu\nHCdRC15C9/Dw8NgkuGkk9CW5XLitpCLbhgaHASwNlOxtJcmoVBAJ6fQwET8RFeWZTtJXbksfuQPW\nZ2T/7btIoty1e3fYNjFFkkR2UdW05Gi8TFT6NHrJSdrL60fGojKeW24h97J7XvsmAMCps2fDbQ89\n+DUag8ol88LxMwCAVpUqtqODJJPSVeo3FhZIqnr5RdECtnaTJBVTtRwiXHQiiNO/plEk2iqP9ejL\n4vrVyJLf3m1C4ljOg/G6u4Uoc5GKdWmVUpQjJa3K11JYpHu8OM35Owoi5dXVkyRVp9wWM11UCGSu\nQYjV7zxIpFhOEXvvec89AIColb5npmhNdHXImjiwn6TWd71DNIO1wrmXVcoy1gS7bW7bIRrK4iJJ\ncPMLIvEucF6Xc0pCLpwdAADYRZIeW1tEep9m10RN9p99mbTWQwduD9smJuh8+axcq8r+gPGEREhH\nOMqxu0Ou8ZPHngAA5M7SfDWX5fUwP0xjyM3I9d0vrTRGVlAh61TeHMNpc5cUswlcjVLOaaKTubDs\n2dUhWms9uxy/+KJI6A31tD4OHrgjbItz9HS7cm9McfGQ8Vkh7w33vay0xTLXUJ1kQnnwvKSKnp2k\n98Tpk6fCtixrMF3d28K2fbeSZtioHCqcOySqOnqa3hl59Szki7KOVwMvoXt4eHhsEvgXuoeHh8cm\nwbqbXMKqRErzclGh584LsfDyBVIlC8rneGyWVO1MSpiiTNL9FtNImknRvt2UPjaVlP2b20mVv/2g\nEHZHnqN6pK5mIgC0NtbzsaK27thJ5hqjo+OYWGlulURYb3n7OwAAuTztd/K4JF2KsY90TqXsPMER\nsJGk2Eh+/s6DAIB6o1ixGnDRaI89NhC2vf/duwAAmTpR5YocDRoUaK6jKulVop3UcGuFxErW07ht\nQpkouJZqnRHTlEs2VFGaYpXV+2hMrlEXp7GlMjSeQlZF+2bpGgtjMifxJIV71iVkyf7c64hkzhfF\n5zjOpHBSJRZLsj93YVbIq0YmBVsatFq/FFer3+qMDYGq++jSGxsVlZhuoDVW3ySmKRcpWFA1JUts\nklqcJ8J4cnQ43OZU/5i6h+fOkKnh1CmJ3pxnP/yxcfE5b2slM0Q6LQSgNXQ/f/r8QNj2g28+CAB4\n7X4yEVRUatcSyDQxMyYmh0SGzB9Gse3V6pXnLJ1UFYicuUoRqkHURXTSfjqNbJGnWJOEbfyMvXRS\nEobVN1AcQAc/1wDQ20uODFVlSok45wVFgGa5hvDAiZfDtrOnTwMA8jl6Xto75R7efYicK9LKlNVQ\nR6bRrm4h4KvgZ2dJPuAIj1+llOZo1IR6x0xMyX1cDbyE7uHh4bFJsO4Sekip1EhZefSYfHnLTCLo\nlLrjnHo2Xy+Sh4vKa06JpNDRTV/VVAN9PRNxlUaTScHb9guJhSRJdN/++j/IOdgNrHvHvrBtN5Od\n0UC0gYCj7WIJkSQsuwbmC+wqtSiSr4syLRVlDLEU/b6DSVQAaE2w6+VFIXFqoWpov9vv2Bm2Oa0m\n0DkqEjSehEvfmZ2VkzA59Zo9IuXEUkxYqVSlnAEUFaUNVVnS1xGALn1pFTr1qTsH1xRtFne0WCPt\nX1oQF7niHI07PyPSeJo1o/qkijLOkRRWLIg0HnCUYUNakVJgd1WszS1sKVydyeVFGjQ36JZstUYq\n1FhMpLF4nNM8p4kMz6i8IV3srjc5Jq6M0/MkNX7m838dtrm6pVq7yLCLZGOzijyO0Lm//9DhsG1m\nmDTSlnbOW9IphGlLA0nDZwbl+tF6uicdSmrVtWEvR1Wp4Ql2TZ1X9zg7TfeuxG579fWKxeccNvms\nrLUtvbTGDx8VjXdqjt4JQyOiXVfZDTKXF7VxjknjS0PiPDA8QNrH2KhIxdt2kHa7Yy899339QnZu\n66Xo6YzSGkyF1vPMlIzLsMuvrajUuyyZW/VMLLBmulhQDgXWR4p6eHh4/Exi/SV04zKMyZdqhm1Z\nWsq59xayST1/Sso4uSIJ07OSS8Nm6GtcZ8RFrZkleKn0Ld8xVx0+0yUuTfmKy7AmEkWVXYq29PWH\nbXUc5aFzeSzMk8vdnttvCduCKH3BTYmkhnS92NfLcMEpMtbWJg6OSEpwRP+d5JpWWTiJlRABl5Sb\nEsmjrYWkwGNHRQv43NdJqvkXv0z9/GcHVfX1WdIgoovKpSvH7o0qb4ezQ0JlDIzy9sCqAgvsallW\nnqmuDRWXCVCXRyMkG6S/CZZW8wuqtB+7nDmpCAAWFxZ5zDJ3hot42JiyYXKOjmjhlbstXgtqBdSF\n8hVPQKZRtJa7X0P3/5mfPBG2PfQDynzZs0WC3VIsIbe3iHunq5p4cUS0lixnLQxUfpN9dxGP1L6b\npNL6TnERTKVoPttUGTfn3jszI7l82tvlObocC4uqfGO
Wnpnpabmf1SINPHzuVOGOKGfnXMiJhlbl\nIMI9+0Rr3rmL+ITnnpesrE8+wUVk5kULDV1OC+Ly6iTtPdtFCt9zG/FuJZ7XICFaw5HDT/M5RGto\nbSKtVqWtgWF7ua0q+ZkLdegSc9WwAI4ulbg2eAndw8PDY5PAv9A9PDw8NglWU+CiD8BfgQpBWwD3\nW2v/1BjTAuArAPoBDAD4gNV+bquEI0Crilg8fY7MKo4kAoDcIukwZWUuqeMI0LTK2zA0QiYPXauv\nqZlMHDG2a5TLog+5pP+6zmcDR002qVShoyOkVm7d2q16T8ecHxYC5thPKSdMW5cQiik2A1wcPMNj\nETUvyn1qVgUepuf4vIPitlbYQwTQtCo6UAsBR4COTXeFbc0zRGQ9/KTU+fzG9wYAABOTRAAd+KPX\nhdsaIzQ/UaPJTmqrqpqONscuj4oBNRyOWomrlMIctavd2wIpWU/7V2T+DfuoWV0D05GnaSERY3z/\n56dEla4scgSiUmVjLkQ2UG5zvIxiddeQPvca4ExMmuS33OZU70AVOvgJ51d5/vkXwrZ5tpLtu01c\nbtub2ExVFtNEvkTn267cdUvsXnvHnfI8pVLkUFDhZ2JR1VlwUdsxdQ/b2b01UFHeFeX+ejlm5xXJ\nzc99SRF3LRbSAAAePUlEQVT1KTZ5xDgCuazMYQtZMq9l56StkevBvued7wrbxifIvPLoaSm60t1F\nJqneLRIN3tlJz0d7qzx3jexQkVBmlQK7Grr0uZmErOsT4+RKHQlkXg17Cjj3RQCIhK7Z2kXWzZOq\n6cpzscT18yoFbS7HavYuA/i4tfZWAPcC+E1jzK0APgngYWvtbgAP898eHh4eHuuE1VQsugTgEv+e\nN8YcB9AD4H2g0nQA8AUAjwD4xFo7kEyQdL2gihmcGSDJNJWSr+fwKAWW6CrpmQR9Ge/aJQRkB39l\nx8cks1pfPxGqhnNalBWx476GJRXYk06ThH7XPW8I2wbZvenFoz8O2378IyJ0brvz3rBtnosJDJ8W\n8rK5laSB8wMDAICZOXGL6mACdHZGcm/kJom8asmqggAsZc5HrywBAUChSGTTQz8W8jjO5NL5C0Je\ndfTQ3J3n8mznLsm3fXcnzVPFyPXTnBUvUMFBzl2vrIpOGHYvM0W5Ty5rpVXV3hwpariyfERL764U\nWFyktzBHiCpZZpmAre8UaSjFLnrJqKKT3NBUWsY4S0hlDvZaL+8ATQZXrSu9xtK7ks4sFyTp2tIb\ntu28jdZnfZOQl7fcSoTm+dMnwravP0Dut7fdKflN+vr7+Fh5xorMnrpsmwFE8oxgKYkNABWnUSgt\nY6VgLC28BxEmGaNKamUt3aVBUko7Ehyc1lovEnJjhp0d6uTuTY3SGn/D3aJxtraThB6PK9fgOL13\ntHZZ5iCnoKLaWJsvF4lEHh0WrbnCwV7btss9cXOmKgWGpSeNXU7Aa5fXMNiorN1gr7+EHsIY0w/g\nTgBPA+jklz0AjIBMMh4eHh4e64RVv9CNMRkAfwfgd6y1c3qbJUNgTQ8bY8xHjTFHjDFHxsfHa+3i\n4eHh4XEdsCpN0xgTA73Mv2it/To3jxpjuq21l4wx3QDGah1rrb0fwP0AcOjQoeUvfW4Zn1CRivyd\ncTlYAKC9hdTq1naJXpuYINPE8QHJL3HnfvJJ3bdL1KCWVs5NwqpPoFLQurwJi8rnOsfVxHN5lcaS\no8yeevzxsG2Rty9khRQqc+rTxTnhh3u2UYX3yQluU7lfHDk3VRIS641cJOG9/XvCtiL7wc5GVja5\n5Ar0rT1xRvzQ4xWax7FRud0JkGkqU0fq+slzMv9bOKCwrPxhK6wOJlVbhFMJW+Vf7ooUxFSa1yqT\nnBVFCkXYDBONsj90o6j+tkJmo6qRNMdlJqMiiux0uTGsKjpQ5vw3h5+QfBwznN8kOy/xAqcukk48\nPkXn+8M/x3XB1fO/XBluFCY0K8m4GlrIbNDU2hO2VXniXX1SAHj0SfJTP/z0Y2HbBK/tpIp/GJ+i\ntlJZZLMWdh7oZr/26RlFNvP9rKjxLUl5zbArRDZWFAG6wHV+Y3FZk0V23nbXWFwQE0Ud1/lNJ1RU\nMJtQnekDABrZX37rPa8N26Zm6P5fuCTPZKlM6y5RJyacRJL6ElUpsiOBM8lxBOi4nOPoUSpiMjku\nc3jLPooXSMTl3RVOmRq/I0CNWs8Vfj/Zsi60ijXhqhK6oRX6GQDHrbV/rDY9CODD/PvDAL6xtkt7\neHh4eFxPrEZCfwOA3wBwzBjjkib8FwB/AOCrxpiPADgP4AOvpAMlZg+eevr5sM0RARkloW/bRpLJ\niPpCXuJSbYsqA+Nsjr78rz90MGxzLlLONc6q3CPlkpMC5Nt29iwlrD83IIUoxoaJZMyra8X56/7c\n0efCtkYuwGCj4l73xPe/DQB4/slHAAB9+yTPimEpJ70gX+W7t5GWUS4IszL6PZK8JqfE9bAWtnAS\n//5OiR585mVyWwzKMu7ubspD4aLSzp4XwvQNezg7XFXcQatu7hRhlGLpysSkrcjujUVFaDkJRSUg\nhHUl3utIezo7JoRVVz0XPbHKRXOBpXtVJCHgKF9ERLp//hitj9//b09Kn/harc1yj09Pkfvp4EVS\nLK+XhH4tULoH/V8Ju2UuVabdMd0ORknKzg2wq2dr2HbrQfrd0SVrwjkBaDfDEkvhc5wPpawcBUIo\nYnklabwWkkoan+Ho7qrKb1ICrR3Dmm9TnWhoRVf0IZDslPOcj2VqUjSvGJctzM7Let6xbQdfX9bz\nuQF6Jop5Gb8JaN0HaqEGLCInmFBNqXO8822URXVphkkaT7Ek2kU06vIgyfidtqpdc6tOQlcZGKvq\nmV0NVuPl8jiWJLddgret6WoeHh4eHq8afKSoh4eHxybBuifnGhknYnNuToidznYiZwpFUa8ucUrL\nUwOSvtP5d9arhPAuKdTIqBA601OkfpWYeNREhCs0UFIkXoWrqE9OiMo/xead82NCgBhWP7Mqom37\nDlLJYiqx1tQg+QQ3piM8PoniPHecTDmvT4g6nGkjonJmUMwrL02SGcjsEJ/rWihwQvxMXNThEptJ\nIoq9zLBpqMr+34NDwmnn86RylityT2yK9qtaTeIQURUUdRvNY0SRZwkmNJPq+pZNCP/7ixTR95kv\nvRRuu2Mvzc/v/Sepkt7fyGq1KrpguTgH8iKXjA0TATabkTne0tHB/8o6ef6fyJ945iomrJsNBssJ\n9UCt55Y2Gndbm0QqV/mZKKrnyRGamth0qW9d0Q1N8K7VvFILvd3i0BCL071YUER1uUyxC6Epb0kE\nqosoljE88fgjAIC9e8W8muJ01FPTYprNztPa7u4QE04EtP7PDInn3dwsmWvnc7KemjhBmrF03rhy\nqEjyGHTRnVyO3h3ZnCQdM+xzn1TpkJMcSe1SS9PY6NmpKif20gqRt7XgJXQPDw+PTYJ1l9CfOUbu\nZYt5FUZYJZ
Lr9OkzYdOFcZKQi4okaGVXt1RC5VJgCaaQky+/K+1VZJJH56MoFpcTPy3N7cvOW2RX\nspwqI+XKgiVUUv8SkyEjFwfCtgV2ZcyzO9LZQXEp7KsS8btnq7ijXZwmqXF2UqTHSidJ/CEReAU8\n8sNHAABWldvKctGHtNJkHPG8l4t0HPuxXGuECw20SQAiipxuNV6Rc5QiTlpSBT5YqtNl/mIsQVZK\nIt1kmTX9zGeeBQBcnBRJ5PwwSeu7tkkK2E9+jH4vnBCyK8VEVWKnHJvlLlWCPhlrlAYyVxDt7uwg\nlRarVq5d8ryR0L11bnt2iSsnE8oq544rixcoQrNaoxDGsmtdB6lcY25GNN6ZCZJgIxEhIBu5RF4Q\nkCathdMyS8GJtIxh/wFKbdusSvtFeY1HYrJfoUBrJhpI9HJTPY1/V79oDUOsfU/MiRY+O0fvkRJH\nSNepPC+xyHKHCqe1G5U3yLl/Lo7L2q3niPZGVZwn6u6ukYEH19tt0cPDw8NjY8C/0D08PDw2Cdbd\n5HL+AqlhmgBxtRKrKrLKMGHQ0iRqeP92Sod518H9YVt2lkjBk8ekzqCrspNnv25VzAhlNqFEY6oW\nolnuB1/i5GEJlTCpkdP2JpR/bcDq7dAFUS/ru9gnuIO25QYkOdfdKfKHTrZJvcf8eTIHDEfFDGXT\nnNrUruyX+vizVAm+r01UuTmuqFRMiHkpxQnIBs9yHcUZIUD/8YeUYGzfLiFx+vqIqG5rUmRrlM1a\nym82xSltYzHp5yKbtVxqXwAYGid1dZZzwLY2iylnge/JPz4iJreP/zpVWJ+5KPM6ygRte+uOsO3c\nMPWvr/G09LNIxHtZXT/KdSt39Uhit9XA1kpEpdtY5V6yX43ao68U2kIi8RQ6iRc9Ryp4EsZVBath\nXalFfLp/g2vwOa+F6WkxObgqShUVa7GYo/tU30j3aSGrqmjxeNIqOZszf85eFFPawYMH6NgFqc7k\noj1TKXkmCmzijakKXJ3t9AwGMWkbGaM+5ypF7qM8kxlesomUSunMmcUa6qXN+aHPqMjXGTbDZFUF\nrnQdvU/icWXWia7N5uIldA8PD49NgnWX0Gdn6AvV1SqSd4mJz44OyT2RZ9fAopJQ93Al7l/6lV8N\n2/7hbx8AADSotKDnTh0HAPR10zXS3SofBkeRFlS6V+fe1arO0cv1Fadn5AsdZXe8XFa+smPsInn0\npESZ7usnF7J4HX21uyryBe5opj7NLEoul1yRrmG7lJsVE7RXy3efqiPJe2ZeztfEeTA6uqQ4R08P\np09lIur4cYnUPXyMpIfJcXFlbH2Zfnd1idbS283jaRVppLOTJL5YWeVX4RwdkYS0DY/RGHNV6mdQ\nFQ2pvZnmuqpqSp6dIGls2939YVuF793ps+Ki9twLNLfpQOZ4ntMQR6qynu587T0AgL1b9+JKqJWr\nRCN0F6yq6FWX+lZnRYWL6FxOSmpcLgUvSa3L++s+BWFKYd0plq6VZFcqLU/b6s69RAtegSC9lhw1\nDsmMEJCTWdICGxpFaj5+jIp3PPFjqpU6NimabKaBnoU7DtwetlU43e0b7pX01ZcukBtsPKpfbdR3\nVyQDkAjZ7II8z/MLJIXXxaRPbQ203kcnSBvV8+9qpOYK8qy5aNglGhrnQWpWeWPqk+QOmVX5oha4\niM/4tM4zvTbNyEvoHh4eHpsE/oXu4eHhsUmw7iaXzjYyOZRUFFuU1ZpGVZ2kr4t8TUcXREXp7+8H\nICotADz/AqW0zE2IGl44TFWGtm4h00dPz7Zwm6vjuKgiuwo5MpvUqWjLuw6Qeceq+oEvniZC0USV\nHzqTPBcvKXMF161sbyHSZXdUokjneIhTI7L/RIRUw2pCSMlojXSbtfBr76eEQZOzol669MGIiWkk\n1dhP502wj7jyrx+ao2u1tcr1G6tkBjl1Rvz7z5yi352toqLu3E7Hbt0q97OznU1HVWkbnyRfX5ew\nqLFRSOGDByjyb3BI0iKfGWK/4QZl3uBI4akpMZfN5znBU1pSD6fSdMycuscdnCr2vCLULsfMjJB4\njiDU5osYOwlXFQPpjmlpEfNOmaMBbbDchKKJRxPWV60s+VtjQZFoGSa2jTqvOzauYiOcz3Mt1LqG\n7tP1RKWsIlWZ3CwXxbxw9gwR2UPnKXr6wF13htvaOCjCmQgBoH87PcftzRI9HWXyslXNv6spOzoq\nhHrApqn6OjHN5Vzd0vkJtR89M62NnL56VtZawAn4AhWHUeG4hiV1Rtl0WIauzsREbZ2YcJxjiIWQ\nwbmszNlq4CV0Dw8Pj02CdZfQm7ggwZTKX3DrPnJD61ak6OQYfb137xTJ6+Cd5KKUUgnpW5hkGT0v\nEurWbvqSb+faozkVHDrD+RvmZkUaW5ghaTkG+TqmOJfJXbf3h21ldqUcVek7i+NE5HTUi3RbLZAU\nMjZG0t19jVLbsTRHhMrZcYkezXaQ2J5QJJorHIHYykTdnn3Uv0BF4BWYUD4zJNd48QUiQQ2TZznl\njrWwSNc6dlYVJGCXq742JeWn6ffkgnI5fZ7m7tyARJ7u3E7aVXuX9Gl6NuBx0by6nBkAMDnptCs5\n77kBzpGxV+XcKbm+i9S+OEVSWHNKcuPYEkf+KQl9aIJyeFSVK9vlmJgQSc1J5nMqirCWhO6k29lZ\nySWU4tqXUzOiNToJuqFBiHd3D1zum6Wpben6msQc5/7VpWSt5fO0npxUCgC33SY5cdYT02qNB5zX\nx6j0ue55791C+WgKCyotbhtt27u9P2zbs4fclnNZuSclftZmp2Tu2tpIQ02o+qGL7ISgn6audtJ4\nFgty7OQ0a8uc6ykeyPpbzDqXZ10IhtuU1p5Msial0kyj6uq2ChL8HtN5mDKqUMZq4CV0Dw8Pj00C\n/0L38PDw2CS4qsnFGJME8BiABO//NWvtp40xLQC+AqAfwACAD1hrp690nithkVWZfTvFN3zfLvKR\nzqSFPLw4QqaMli3iS93eRkm08guiGrc0k4pSVyemgXe+/5cBAN1b+wEAE1NiXjl3iiIrExBf0mqB\nVL2UIpacGpxRyXTuOUjmn+eePxW2ZXPO1COq8YWLpN4X2bxybI+kdjWc9Gq8SVTkDI/BlsRE4KIM\no1fxjS5ViQCLR8Su5Oo2JhNy7OB58s2PJUnlX1wUsqfMhNXorKqYxDUoX0zIkuntoTnZs1MIzQyr\n/5dmpe+XniWSq7Vd5jMaJTOMI+UaGyUOYSFLfXnbW98Rtk2OnwQAnBqSuZ5l08z4jPQpHZDJ5+LQ\ncNiWCGgcJ88rcw3XVO3p240roa9PEnw5MrKjQ9LSWo6XSCuThxuPJi9jSZqnLrV2nZ+yJi9d2toU\nn6+siGq3f1T5V7t7tsQ3vUak6s2CwMp40uyTnVDraffOfgBAM6//dJ2YG5zFMaMS5s2Ok3ktriK1\nnVklohJcTXGSu7Kuc8vJ6zQnXObw1bqEPOOxNnoHLXJlo0KzzPXcPN3
/skqEF0tQn6dnxFw0N0dr\n0kTFySPgexZViQJdYjv9Uo5E1laxaDUSegHAW621BwAcBHCfMeZeAJ8E8LC1djeAh/lvDw8PD491\nwmpK0FkATtyI8X8WwPsAvIXbvwDgEQCfWGsH+jht7G27e8O2Vnbvi6jckT19tF/9FiG7Gjl6bODc\nibBtbJyk4RaV4D+eISl0kvMnzC0IYTo9RpJcZ5Mi+ziXQjwubU4K0pGa7S0kBe1VfW9pJ0lzbk7l\nYSnTF/8Sk3PfnxkIt/3qfT8PANizV/KRzHPk6XOPfTdsC5ggjV4lt8PEAkkSUSPStXPRam8TCbFa\nfgoAcOwo5UtZErlYdceqNiZZ51QxixNnSeMYGBTJp7WJpJDeLTJ3jWmSOHMjsl8sIC0pHqP9GxqE\nFO3soHs9qVxPz50mKWt7m7RNjZLkc/KszHUnR/eePC8aR1cH9bmnWZGnRbp+cfIkrgQt5Tq3Su22\naDmlqyaUnXSt8wC5qM2qci8MXeRU9KKTvh0ZWiuXipb8Hdl3tUIUTtK/HtGe14K0KvDQ0EDPRLkk\njgc9PfRst/EzFCh5M1RCluTUpXVaqspaj3KaW1utoaGoPDBRrjNbUW0JJi2NkXvstiYiHIGrnol4\nsyv6op5J3q8+I2T3yCit2bl5RXLn6TxVpUk4l2RdsCSRWNs9W5UN3RgT4QLRYwAestY+DaDTWnvJ\n9RlA5xWO/agx5ogx5sj4+HitXTw8PDw8rgNW9UK31lastQcB9AK4xxhz+2XbLVA7lZy19n5r7SFr\n7aH29vZr7rCHh4eHR22syQ/dWjtjjPkhgPsAjBpjuq21l4wx3SDpfc24dQ+llm1qUomoWL3XAWvx\nFJETW3qEPE0wydfYJOp6mSu2ZJXJ43vf+iYAiZi85Y67wm0xTm2ZTAkBm2TiRVdTcSYJraKl2Q97\n+zYhORsbSJUfGZGpzTfRuU2Uv3kqSdWhd7yRzqsqMZ04QWaASFRXQuEfwcoqWDEgdbGiEoBl50hN\nj1ql3vaS+eWZo3StfE5MFHH2q00EQuLkmLDTIkCUayXmlRo8NDLH/4ppoLme+rK1W+a4pYFO5Pyl\njfIIrrLZIJ+X/k7P0++zA6IOz45ymuWyqPJlTtWbm5H9gkaasx0S1hD2ea6gEiFdhnPnzoW/nblC\nm0haONp0YmK55qnJrhST+1llmmlnc4k24TgfcmfC0dGzzv9dtzkfdu1z7kwu2rxytSRjgupl/9aC\nXn9BjbYVjlRrp8LjtkvMetTPGD9/FRWbUi7T/lE9Lpe9WNXldHWGK6otzvU7K7oeqzNDqT5FnGlM\nVbGK8vMW40pFCWWadGRoUUXAltjUk4iKybGNzbkplZbXFUqbV+ZfY9x2ZWrT0aWrwFUldGNMuzGm\niX+nALwDwAkADwL4MO/2YQDfWNOVPTw8PDyuK1YjoXcD+IIxJgL6AHzVWvstY8yPAXzVGPMRAOcB\nfOCVdKCtmSTzpHJHct9Hze/09pIkf/CARFmC3aC025KL3ltcENfEqSGqWzp1kSSueuV62JziitxJ\nkUZDAlT18/Lk//QHSQH1Ki1mzBChk1ARnVNMxjYniKg9/fJAuO1LX6F0v3epcQ0yyRtRkW0u9Wo0\nsvI3uJQjaaGi1JvmVpIkdeTh/oNcjzVDOTImVarS3m7qZ0NGJN+XTtAc/uQZSbM7PMTRoFpqYilH\nk1IznJZ07oxIMnVMlDaw9G5VDczBwQEAwC23iGWvd+suAMAzx8RtsaeZ7mOxqshGvkQ6I5LNFOfX\nyS3I+KdZgVgoXTlXhnZRdHPX3CzulU5CbmxqXLafnusk17nUbogOWrp20rqrc5tS7pCOZNUSejS6\n/PG9scTn2q6lnREcKVyuqLwlLIV3dNEcZ+rknVAu8HpSrocFdtuMKsnXLbuIrhvM90LzpCbKmqEa\nQoHnP1Br0WXrdtpCRI05zs9/RGnNUS66UlCaV4qL5yQaZL8SWxJi6j2R4zxEVSNrolham4S+Gi+X\nowDurNE+CeBta7qah4eHh8erBh8p6uHh4bFJsO7JuVxiIatUGfdLmzeaOXlTQ50Qa0Um6uZVmlNb\nobYG4STQ2UwmkfpW9sMuSBRXstH5l2viyJlXlhNLRrU5ksUoZsWpxnVpMcO4ykeO7GpqFx/VmSn2\npT7yVNhmwASQ6lOM1baUIm9rwUUb6ghER6jpRFBO1a+VuCnKfutWqcNd3UT8vu7e14ZtTz11BADw\noyefDtumpriWo9Jla6aB5bqxUSaeE8rk1dVF/sgTys11cpKigV8+L4mYRsdpjGMTQuiWmNCyRsir\nVIrmsVCQ9ZRbpLaKvbLZoKtLyG5Hiteq8LPE55xNXbUqEtU6Vq/xy9PW6m0rXX+joKxsHuWyqzMr\n2xNsdg0dEBQ56YjKjPJlnyvQfXdJ8gDAuhS0FZmngC+i56tU4uurOXeJ4kol5evOp3EOEhFFrAZl\nVytWzhE4k496nznzo61KNLqzEjfUy7FRHn+hrJ6dgphfVgMvoXt4eHhsEpgbmfPBGDMOIAtg4mr7\n3uRow8YfA7A5xuHHcHNgM4wBuHnHsc1ae9VAnhv6QgcAY8wRa+2hG3rR64zNMAZgc4zDj+HmwGYY\nA7Dxx+FNLh4eHh6bBP6F7uHh4bFJsB4v9PvX4ZrXG5thDMDmGIcfw82BzTAGYIOP44bb0D08PDw8\nXh14k4uHh4fHJsENfaEbY+4zxpw0xpw2xmyICkfGmD5jzA+NMS8ZY140xvw2t7cYYx4yxpzif5uv\ndq71Bue1f84Y8y3+e0ONwRjTZIz5mjHmhDHmuDHmdRtwDP+e19ELxpgHjDHJjTAGY8xnjTFjxpgX\nVNsV+22M+RQ/5yeNMe9cn14vxRXG8D95PR01xvy9S0TI2266MVwNN+yFzsm9/hzAuwDcCuBDxphb\nb9T1rwFlAB+31t4K4F4Av8n93ogl+H4bwHH190Ybw58C+K61dh+AA6CxbJgxGGN6APwWgEPW2tsB\nRAB8EBtjDJ8Hpc3WqNlvfj4+COA2Pub/8PO/3vg8lo/hIQC3W2vvAPAygE8BN/UYVsSNlNDvAXDa\nWnvWWlsE8GVQGbubGtbaS9baZ/n3POgl0gPq+xd4ty8AeP/69HB1MMb0AvgFAH+pmjfMGIwxjQDe\nDOAzAGCtLVprZ7CBxsCIAkgZSn5dB+AiNsAYrLWPAZi6rPlK/X4fgC9bawvW2nMAToOe/3VFrTFY\na79vbVi9+ilQER/gJh3D1XAjX+g9AIbU38PctmFgjOkHZZ5cdQm+mwj/C8B/xtLqBRtpDNsBjAP4\nHJuN/tIYk8YGGoO19gKAPwIwCOASgFlr7fexgcZwGa7U7436rP9rAP/IvzfkGDwpukoYYzIA/g7A\n71hr5/S2lUrw3QwwxvwigDFr7TNX2udmHwNIsr0LwF9Ya+8EpZBYYpq42cfANub3gT5OWwCkjTG/\nrve52cdwJWzUfjsYY34XZF
794nr35VpwI1/oFwD0qb97ue2mhzEmBnqZf9Fa+3VuHuXSe7iWEnw3\nCG8A8F5jzADI1PVWY8zfYGONYRjAMBcoB4CvgV7wG2kMbwdwzlo7bq0tAfg6gNdjY41B40r93lDP\nujHmXwL4RQC/ZsWPe0ONweFGvtAPA9htjNlujImDCIcHb+D1XxEM5dz8DIDj1to/Vps2TAk+a+2n\nrLW91tp+0Lz/wFr769hYYxgBMGSM2ctNbwPwEjbQGECmlnuNMXW8rt4G4mQ20hg0rtTvBwF80BiT\nMMZsB7AbwE/WoX9XhTHmPpAp8r3WWl1gdsOMYQmstTfsPwDvBjHJZwD87o289jX0+Y0gVfIogJ/y\nf+8G0Api9k8B+CcALevd11WO5y0AvsW/N9QYABwEcITvxT8AaN6AY/ivoJq8LwD4awCJjTAGAA+A\n7P4lkLb0kZX6DeB3+Tk/CeBd693/FcZwGmQrd8/2/72Zx3C1/3ykqIeHh8cmgSdFPTw8PDYJ/Avd\nw8PDY5PAv9A9PDw8Ngn8C93Dw8Njk8C/0D08PDw2CfwL3cPDw2OTwL/QPTw8PDYJ/Avdw8PDY5Pg\n/wPaMYn0Dh565wAAAABJRU5ErkJggg==\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x116a08b70>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# show some random training images\n",
"dataiter = iter(trainloader)\n",
"images, labels = dataiter.next()\n",
"\n",
"# print images\n",
"imshow(torchvision.utils.make_grid(images))\n",
"# print labels\n",
"print(' '.join('%5s'%classes[labels[j]] for j in range(4)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2. Define a Convolution Neural Network\n",
"\n",
"**Exercise:** Copy the neural network from the Neural Networks section above and modify it to take 3-channel images (instead of 1-channel images as it was defined).\n",
"\n",
"Hint: You only have to change the first layer, change the number 1 to be 3.\n",
"\n",
"\n",
"```\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
".\n",
"```\n",
"\n",
"**Solution:**"
]
},
{
"cell_type": "code",
"execution_count": 61,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"class Net(nn.Module):\n",
" def __init__(self):\n",
" super(Net, self).__init__()\n",
" self.conv1 = nn.Conv2d(3, 6, 5)\n",
" self.pool = nn.MaxPool2d(2,2)\n",
" self.conv2 = nn.Conv2d(6, 16, 5)\n",
" self.fc1 = nn.Linear(16*5*5, 120)\n",
" self.fc2 = nn.Linear(120, 84)\n",
" self.fc3 = nn.Linear(84, 10)\n",
"\n",
" def forward(self, x):\n",
" x = self.pool(F.relu(self.conv1(x)))\n",
" x = self.pool(F.relu(self.conv2(x)))\n",
" x = x.view(-1, 16*5*5)\n",
" x = F.relu(self.fc1(x))\n",
" x = F.relu(self.fc2(x))\n",
" x = self.fc3(x)\n",
" return x\n",
"\n",
"net = Net()"
]
},
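{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick note on where the `16*5*5` in `fc1` comes from, since it is easy to trip over: for a 3x32x32 CIFAR-10 image, the first 5x5 convolution gives 6x28x28, the 2x2 max-pool gives 6x14x14, the second 5x5 convolution gives 16x10x10, and the second pool gives 16x5x5, i.e. 400 features flattened into `fc1`.\n",
"\n",
"A minimal sanity check of the shapes (the `dummy` tensor below is just an illustrative fake batch of one image):\n",
"\n",
"```\n",
"dummy = Variable(torch.randn(1, 3, 32, 32))  # one fake 3-channel 32x32 image\n",
"print(net(dummy).size())                     # expected: torch.Size([1, 10])\n",
"```"
]
},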
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2. Define a Loss function and optimizer"
]
},
{
"cell_type": "code",
"execution_count": 62,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"criterion = nn.CrossEntropyLoss() # use a Classification Cross-Entropy loss\n",
"optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)"
]
},
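{
"cell_type": "markdown",
"metadata": {},
"source": [
"`nn.CrossEntropyLoss` takes the raw (unnormalized) scores that the network outputs together with integer class indices, and returns the loss as a single value. A tiny illustration on made-up numbers (the 4x10 shape matches one mini-batch here; `fake_scores` and `fake_targets` are arbitrary):\n",
"\n",
"```\n",
"fake_scores = Variable(torch.randn(4, 10))               # 4 samples, 10 classes\n",
"fake_targets = Variable(torch.LongTensor([3, 8, 8, 0]))  # arbitrary class indices\n",
"print(criterion(fake_scores, fake_targets))              # a 1-element Variable holding the loss\n",
"```"
]
},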
{
"cell_type": "code",
"execution_count": 76,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"datas = list(trainloader)[0]"
]
},
{
"cell_type": "code",
"execution_count": 90,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"\n",
" 1\n",
"[torch.LongTensor of size 1]"
]
},
"execution_count": 90,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"inputs, labels = datas\n",
"labels\n",
"torch.from_numpy(np.array([1]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3. Train the network\n",
"\n",
"This is when things start to get interesting.\n",
"\n",
"We simply have to loop over our data iterator, and feed the inputs to \n",
"the network and optimize"
]
},
{
"cell_type": "code",
"execution_count": 93,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"data1 = list(trainloader)[0]\n",
"inputs, labels = data1"
]
},
{
"cell_type": "code",
"execution_count": 96,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"Variable containing:\n",
" 0.3008 -3.3488 2.4630 1.9695 2.7805 1.7066 0.4348 0.8702 -3.0786 -2.6789\n",
"-1.2626 -2.5829 1.0306 2.4920 1.0897 1.8180 0.0347 0.6267 -1.9659 -1.4591\n",
" 0.7857 -4.1608 2.0853 3.4615 2.6721 2.6390 1.4969 1.0916 -4.5905 -3.3486\n",
" 0.3140 3.9323 -0.7381 -0.6111 -1.4288 -1.6650 -2.6042 -1.9534 -0.1365 2.4425\n",
"[torch.FloatTensor of size 4x10]"
]
},
"execution_count": 96,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"net(Variable(inputs))"
]
},
{
"cell_type": "code",
"execution_count": 71,
"metadata": {
"collapsed": false,
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[1, 2000] loss: 1.199\n",
"[1, 4000] loss: 1.195\n",
"[1, 6000] loss: 1.170\n",
"[1, 8000] loss: 1.198\n",
"[1, 10000] loss: 1.187\n",
"[1, 12000] loss: 1.198\n",
"[2, 2000] loss: 1.129\n",
"[2, 4000] loss: 1.120\n",
"[2, 6000] loss: 1.116\n",
"[2, 8000] loss: 1.108\n",
"[2, 10000] loss: 1.108\n",
"[2, 12000] loss: 1.125\n",
"Finished Training\n"
]
}
],
"source": [
"for epoch in range(2): # loop over the dataset multiple times\n",
" \n",
" running_loss = 0.0\n",
" for i, data in enumerate(trainloader, 0):\n",
" # get the inputs\n",
" inputs, labels = data\n",
" \n",
" # wrap them in Variable\n",
" inputs, labels = Variable(inputs), Variable(labels)\n",
" \n",
" # zero the parameter gradients\n",
" optimizer.zero_grad()\n",
" \n",
" # forward + backward + optimize\n",
" outputs = net(inputs)\n",
" loss = criterion(outputs, labels)\n",
" loss.backward() \n",
" optimizer.step()\n",
" \n",
" # print statistics\n",
" running_loss += loss.data[0]\n",
" if i % 2000 == 1999: # print every 2000 mini-batches\n",
" print('[%d, %5d] loss: %.3f' % (epoch+1, i+1, running_loss / 2000))\n",
" running_loss = 0.0\n",
"print('Finished Training')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We have trained the network for 2 passes over the training dataset. \n",
"But we need to check if the network has learnt anything at all.\n",
"\n",
"We will check this by predicting the class label that the neural network outputs, and checking it against the ground-truth. If the prediction is correct, we add the sample to the list of correct predictions. \n",
"\n",
"Okay, first step. Let us display an image from the test set to get familiar."
]
},
{
"cell_type": "code",
"execution_count": 65,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"GroundTruth: cat ship ship plane\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAB2CAYAAADY3GjsAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztfWmMHdl13ner6u2vX+9sNndySM4+mpGokWSPFVmyk7Fs\nS0ZiKHLsREkUCAhsxDaMxHL0w8g/Aw6M+EecYGBtXmB5IMmWIjuOpNFuaxnOKs1wHe5ks9n78vaq\nuvlxzq1zemVzODPNbt8PIPrxvnp1l7pVdc75zmKstfDw8PDw2PoINnsAHh4eHh6vDfwD3cPDw2Ob\nwD/QPTw8PLYJ/APdw8PDY5vAP9A9PDw8tgn8A93Dw8Njm8A/0D08PDy2CW7rgW6MedwYc8oYc9YY\n89HXalAeHh4eHrcO82oDi4wxIYDTAH4awBUATwP4JWvty6/d8Dw8PDw8NoroNn77KICz1tpzAGCM\n+QyA9wNY84E+NDRkDxw4cBtdenh4ePzjwzPPPDNprR2+2XG380DfDeCy+v8VAG9bfpAx5iMAPgIA\n+/btw/Hjx2+jSw8PD49/fDDGXNzIca87KWqtfcJae8xae2x4+KYvGA8PDw+PV4nbeaBfBbBX/X8P\nt3l4eHh4bAJu54H+NIAjxpiDxpg8gA8C+OJrMywPDw8Pj1vFq7ahW2tjY8yvAfh/AEIAn7DWvnSr\n5/nkbz4CADA2zdryORqWCeR90+m0AQBx0pXj8nkAQJLKb21q+bdJ1haE/F23Qt9BvsvlWwCAUC2F\nCSyfN87aujH1kaZGBm8iHpO0tfl7dRRSnpsxhucic0gSnquaf8Dj66h51XkojY6M/fefPI/lmJyc\npDHFMnbX72uBV3Uuu+yvbgrc/+XLQBpVx24N1bWGW2vZJ+t5ba02dnf8yMjIiu8++XW1vgmt59TE\n9ayp3aK9c+iuw1lbX28NAJALZUz5HG3AvG7jvR0ZGW8SNwEA1UqOzyHjjfhz6DYzgJmZaQBAT09P\n1pbL5fi8cpwJ6Ldx2snaglVEucBQY6PeoHNEck8Ui0UAQKcj54j5niwVS6ov6re/Jm0Of/zxP8o+\nV4eO0m/DfNZW66kCABbatMfr81NqvHz/qU0R8SRKUUHGGfKYA7UP3DKqpiRNVrSl3LakD16DgOe1\n2h4y6hoaN840WeU4+W2hQGPOBzJ2WPps8nLtGlMnVpxnPdwOKQpr7d8C+NvbOYeHh4eHx2uD23qg\nvxbosHRlbVMaWTItoJI1BaC3VhQpydtJGfplnKPGtpYkUv4tS36hvAAR8TlMKlIz4jb3KX2lfI6O\nKWZtSUhv1E4qJ+wkAZ9PfmtY0i/mnFQm4lEQsTbQVf2bmKcl53DSaBiubyUL9eReB9yOtK/XJLtk\nLD2mWqexTqWSNqc1GYiELme5fQl9NVTLIj0Flm6Vdl3a0g5JssW8nLdSouMi1ZXbR4VIxlnKB/yd\nzKeduONoj+Vzap/w+aJIrq+T/AMtIfL5CnmRfN2WqTdkj7kz59Vxlu+xgDvLKQndSf7ddjtrc/u4\nVFBS5jr7I7Vy78RhP50vJ/d4EpKEHuRYQm8uytiSOo9Dzte2dFw3kDVs8RoroR2dLmlSgbo3mg16\n3uj7xc1Ra9BBQJ8tazeB1rJ47eJY7WseilEakpPy+/v7s7ZCqYfPL9cudXu8IJNMFqu4FfjQfw8P\nD49tAv9A9/Dw8Ngm2HSTi3XEoxVVzjIBZRJRW9IuqTxhSd5BTr3UVgZHRuSVbhbbHJ8jXHIMIOqS\nsStJOaMIGxuSuthMRJe7PkXqWL0jv11cpLbQSh89RSbFmNCrlYUwKhVormmgCKvMvKLUQf7bTddP\n1eBMCK9XrdiNnneJecP9RhG/zsJinXlFyRbtLq1JpPXrhK+1Wa3/dJW2jWG9+URGiGVnNsmH0leO\nifdCIMcV+XtNaLabZJoJQ0XeRbQHuu2W6oNNbTG1WSO3Z8LmpXxO9k5malF7zRHEiTIhNhrU/9TE\nRNY2MkTqvyNMASDMU38h96XX2ll/InV8m+/TSM2123VrsfLRElhZp4THnCiSOzE072IP/XZwvxDV\nwdwMAKDaEDNMp0XPjKQqppy0tw8A0KPMYK7fQDtZtOl+0w4VxSKTkpq85/3h9rPe1+58cVfmlZ1O\nbf98RPu4VFLkMZwJUa5TCkfKKjn7Fk2cXkL38PDw2CbYdAk9SlgyD5WEzNJFIYzVgfymUm/ZjKBQ\nb9TYSbBKksjl6c2488DdAID52cnsu8kpkl5ykUjjAZjsjGV5mrYMADhxUaQcWxgEAHRDIXY6LC0s\nzk1nbVfHWboo0vmSsdnsu307qd/BHi29OVdGmb8TOBK70h1KYzVJ4tXiNZPys7HL+WzqXOlIpOkq\nYunMuXMAgJGdO7K2lEnu4QEhlopMHqW3Mc711imvXF/TmDVEJVHlmIzLqbYgof2Uz8n1NGHCx3fU\nb+kap0ZpZim75rZY8lf7qsXzLyvtLnR7XEmZThuqt0Tyf+aZZwEAXdYUAKC/9lbqoyD3kxO0Mxda\npckGTqK02lGANQpNditX3+WIIe6VAWjfp0rjabO2FvLfimI2a2W+1s8+nbV1JklaH33g7qzNTND9\n1zaydlWe2EKznrUVeT4FK1pgMMikrCJF3SOmXabzRl3Za2GXz1uRa1iYm6Pj9t6XtTX6emnssVyT\nhK9dMZXnjrMSBIkitJNbk7m9hO7h4eGxTeAf6B4eHh7bBJtucnH6uIn6pIXV4FhHTzLx1IlFvckz\nyZQkWuVL3EnkOGZ03vZTPw0AeOYfvpt9d43NL3VlXokTUr0uXrmRtZ2/QmlqCv2jWduekYPUZ0FU\nyQ6ribmqJCKLW6QaTt24BgAo9w9m311ZpMjDllKbR3pIDSznhBRNuqQuBzexLqxGir7ekaI3N80w\nyZsTVTJhH/PmIpkZZudEHR6fJHNVqUfU5kGOhgyMJsWZvDY3IUWdGeomo1yOvHImt3yOnL4AbC4M\noaJyuS0HuXZdVrUTFWUc1mgtjFXxB+zrnHJUMhQBvzhPZrpqWQjAgNfdRWwCQMRR1rMNMa9Mz9Pn\nkvKD7/CQO11Zuyjv9g61JSoqO+b7rqP6yrNp0Kq9mybrmQRVXIEjKtX8k5jny3YOo8whLUPXP5fK\nnjBDZJJrLMgzoXv+NI3XiGkq5SWr59TYeMz5rqxx5zJfM7UmzvGixabUsKXWi7tt75RxNq9z9K6R\n+9/0DtH81Dp1eR/llGnYRZSHytQX3eyGXwYvoXt4eHhsE2y6hN4O6M0711ARYyzR9FdF8qkxsRQp\nadARVUvcjJiU0RFdjQaRkl/70hcAAOOzImWML9JxF6/OZG0Xr1Ga97AoUVpJSDk6KjV58+bK9H2k\nclkUWIIsBjKfyQ5FpY3u2QcAaCly5tw5ktCnZ4UwCXfTeQ8MS/85dtszydqkEyBRfja9NXnU6sNX\nEQqcZB6sIqEnSvJKU+dKqvPwkBQ2MTW
ftc3Xab5Nl7ejIdckKBABXW+K5FUtszSqxubk/Y0qILeq\nqRSMSEoJuzDmVFSiczkMlIRuU25TLodRsNK9LzQcgZjIvN3Cx+wUkCiydXGB1u6SdnOMXJS1jGlv\njdZOuyi+8OKLAICH7r8/a0t5LO1E1rjo3HtZU2g2lDYcUR9xVyT/MKK+uipvULtN3/egF8uhNemU\n97PVMiU7NHRYek8imVfvAq/XsLgylnbspzHZOTkHu17aoZ1ZUzNH542uS24Y5+tcV/euHSHNOZfK\nmFqs8VdYW+wsyPzbvIZRSZGYvK+jQSH0TY41HivaQA9vhVC53MaG1t8Eyl0Xtxb57SV0Dw8Pj20C\n/0D38PDw2CbYdJPLRJNUiumukKLf/PtvAADuOypmi5+8n4iFfuWv7ggYnXQnYHUlUWSL49HOXyT/\n5ummqD62PAAACKvKR3aA1NtSn4ypw369HUXA1fppfLWqjPPGdTKhzM+IH3oPq4FFjhS7NCN+8Lka\nqZA3xqTCVPX6AgBgp0pBWnKpenUSsVVQ56RDUBGlEa+PVW0hJ3lyf3UKUGd+CdKV7/tAU4tswlhU\nZgBHkJZUYqcWR9KNKZPLjRn67JJydZUtpbFAJPKNSVnDK1fHAAD3HTmUtd11YA/NQflGZwSt1dF2\ny/5CzHTBOlRpqPyGU0dKK1K+OcfzaYsabjmZU1iS+ef5muVVYi3TJbNbon4Ljow2TMZa5aNer5NZ\nYXxcjq/UqtynSk7G695ZlOOK7BM/MSvxD8/+iMwwlYKM6fAhWtuITT/txkL2XYmTyKVtSaKXMHmb\naAtBy13jUayAWmqXvjbVkc88DWeaySmTV+HsGTr9M9/O2uK3srlKpaC1HC+SX5D7pAVai6qK/wg5\noVhaUVG2ltNxq0R5PYP0DMhdZXPNokSq5kb4mXFZTDkRX5PWxIvSF5tm06Pim97ixF6Bep7kYzbh\nqHvB3oTvXw4voXt4eHhsE2y6hB71klTQmJJ3SzdPxON0QyV675DbUC0vUkvqIim15BnSG7rVEel2\ngl/kkwv0Ni73idtg/zARlfVUpMch0G9DRZh0ctRvqy5SS2uRfrN/RM7XYGn8RkckGcMS0tw0S00q\nsq7Jb/wwX87axueJoB2bEwlx/xCnNr3JG3u2SZOtllXqYc4loQt2ZMK3I2d0ulcW0c1qVRBWcYe8\nPiaVBwcGSOMpFYUoardo3uWCtO0cJo3LpQWuN2SuFZZeOi1Zw5AnvqjSt8ZZVKxsY3GhVC5yZnmL\n/Gc9j8uiTkvroviUhF5gzaCqCOhel4JWuWEW+HoXtTDKmlTQEkk6K3bABVM689JXT4W+6+f1BYDz\nV0gbPHdZim6cPvsUAGBmUqTRxRadp9GV+jMR2A2xLoTig3dT0Yn3/ezjAIDdal+3izSHVl3m1alT\nvzVVjN403f0h0ZsOuVBFZfI6OXIUkPSxEcuZ1RnpK75CLr81FYG7cI367xSFgLWg54S5Li7HlV1M\naNaU5Avab6VF5YY5S2NvKTI6niTNMM9rGM/LehWmyVGi21TurSV6ns2evyznLZGE3jO6P2vj1FCw\nykWx7UhxtZ91kZuN4KYSujHmE8aYG8aYH6m2AWPMV4wxZ/hv/3rn8PDw8PB4/bERk8unADy+rO2j\nAJ6y1h4B8BT/38PDw8NjE3FTk4u19lvGmAPLmt8P4F38+dMAvgHgt1/NAO5+6FEAwJXvncraqr2k\nwj36jrdlbeWQSMOOMnk4U4JRKUUTS8pCz469WdvzLxKhUu0jNX/3fvHHtazm5nLKlNMmkqPTUZGq\n3Feo1KGXXngBAFBTFUbKFVLvKmXxIb92fRyAJA4Lldo4wH7DszOi5s1M0+fzY6Le7Rohv9pImZxW\nQ1QjNTlR5pKuq0OpSCb32ZFTgV1pXrCrOKRrf3UX5KYjFbPkTcq808c+vF2V2AisfpeZjNYmF8MR\nwEbZgQol56MrbTGz3UuII0foLrGvcFTeysPWtblcvnAh+9zl9M0L87L/ki7N++pVMTnN8F6oL4oJ\nb8cgmUmqFYnyDLnyVkelXo04iVzAieLqyhzTchOysv8uXSNy/fwVIY/rHfptsVf5QVdogXTtmwpX\nTBq7eDpru3aN9um3v/33AIB7FQE93EfmheaimHJczc/uvWJeWeQ0t4/d/04sRyEv87fO/JIqP3w2\nXQX8d1FVbFo89iYAQC16S9bWWKBr0VWOEqbA66Nq7+ZK1G9d+dy7eIquSn6V4/ukmVcJy/hvk4na\nxqJc/wqft6WOL1RplQd6xGiR8LNjsaR2IPvGl7ryW2dC1CEk3VtMPPdqSdERa+0Yf74OYGWFXYYx\n5iPGmOPGmOMTKtjBw8PDw+O1xW2TotZaa8yqVQfc908AeAIAjh07tuK4ci9JlPsPHc3amiys7jso\n1dSHWLqbPX8ha+u6iLJYCMVH3/kL9NtDx7K2gw/Sb555jiTq/qpEkV27QVJOZIWwK7jCCmq0i0wG\nzU6Li9JANbf8MCQshQ8NC1HkCjZMzpDEbVQUZQ+7PEahIkJYMnvl8pWsbbifpLcje8S9cjV84k/+\njPpQRHGOJYRqj0hIhw8SGfzWh8iVSqX5yNwbdY4W6yREFW0ZsxSuibp8gfqwioLM50niHuwXktvV\nr4yYAM0rN0fkWPJREYizTBTPzonWsjBH0mK3oerR8lYcHBSX0yOHSdLM5TV5yocvFeWX4Nv/8D05\nrVlZHKXJEb8Xrl/L2rLan2o9+3s5ylgRxQU+LqdcGSN2pQu4pmijJRJlxOewqkjG9Wki1LvKvbTc\n4+Yta+dcGLWLZqtFY6/11LK2t7/lQQBAnVM/t1QK3kuXaP1feeWVrK3J7nUXp2T9mw0672O/gBWo\nVESTjnktuom+drS2LqWyySu33REiPufrorVMzNH8jXJb7nDd1LwmFmfpNzo3VCFP98S8uk+KnAcH\ngfzWXe+2i2RW9YPnmpxfSikZZY5u7dkjFoIsQ7DKy+LyEC2pZeGeJDoa/hb9Fl+thD5ujBkFAP57\n4ybHe3h4eHi8zni1D/QvAvgQf/4QgC+8NsPx8PDw8Hi1uKnJxRjzFyACdMgYcwXA7wL4PQBPGmM+\nDOAigA+82gGEBSIRro2fyNoefgtVU6n0iiklXCDiKVFRVBGr0OcuC1HxWD+ltEV5T9bWUyGVqxhR\nXyXl811kc4D2Dd+9i6LcXlbqZZ4JnfkF6evgXjITHb1HIsCmp7k6UU1U/mvsE2uYdOnrFxPFHJsS\ndDKrUpl+21SJgM5cWuCxr/8ObjK52GmKupxjc8aCymFU5rbk3nsAAC2rK+dwNRel8jotcEnVITa/\n9A6IeSlL3qXrN7qEXXkxObjwXadQpspwdYEjeq/eEMVveopMXc2milRss8qrkni55FB79gqts28v\n7YVKXm93R/yubXJ5/sy57HO5RKYuq6pItWPqq1elQ3bEX6clPtQTi3QtQmWu6imSqS1WKWoNk4Ah
\nOymbSGIJCnUyEXS6QrZOTzsyVMcG0N+OSvq1UKc16zSlbe8w7cHBfjE/umjU6Rniugb7xER37E3k\nSHBFxRzMcZT3yStihgyCtZNJRYrkLPXQPbioaoRGbKdKnD+6iqIMeH+mkGttuKJZpPp0n7oduXdK\nbEKNlCnFmbpy6rcJm/hcrVIAiHmH5kpMWCYr03LrZF65mM7XUd4DLgVvMdGpl2P3ZQYXNa3vcHOL\nSZ834uXyS2t89Z5b6snDw8PD43XFpkeK5opEyrRaWsoiqSWnJOlyxRFLKlUtv6GrkbxRP/XExwEA\nP/8vf0364Ii2PNdPDFSV9oOHdgMAbkwLsdVaJOlq546hrM0VCWh3ZJyHDhNpe9dhIXTnnqP6jfUF\nkTwckRNzVFxTSc99XG8wsSL59/aTRBF3ZJxhQHO8cm19uuID//xf0DgVUVjhHDJGSXIllladEDQ/\nL5JfGvP6RyKhReyiZRWJ12RXPpvKNnKV0B0RCwCRk4ZySmphKcxJ+do9q8W5T1yuEgDo57w6iar3\nWAxpXrNTonpcuXoBAHBYEeohS2Zau3DS8npeYfNLcmrQvMrKHbXEkvSevXdlbV2XKvi6RG9OsnYx\nMiKuhIUh0hrqsyLdphwN29tP2kWhIK5vLZ52I5brVOR7IunKXguZWMwr8jSXp/XvFuXaPfpmkriP\n7t8lfXRo359/heb1yqmXs+/e8VYiTPfuleMvvUiuxF0leabrpHfO56X/PNfXTVVK2RKT4TGnKl5Q\nkbIJE5/FXtFuRyrsIKCIQ7fHtWQbsswbquIoS0j4ZbAqetVJ6Em4tPgHAAT8Oa9T3HIfbfWMcd1G\nKuozAV0nndLZ8H2ko7a15r4R+FwuHh4eHtsE/oHu4eHhsU2w6SYXwxFjjUUhkVpsLsjpRDxTTFqG\nYobJgfyQR/tE5TlzgqJCr105K500yJxy8coFAMAjOx/Nvtq9n0ihXTeERKufJVVyoCDEZg9Hmb7y\nyvmsbXQXmWtmlbmiy+ra+IRSpV2yK/Y1byiTi+HkPJr6qLh0vKmQbXnDxNbkdayHlOshhupd7Van\nmheSrVSktW1yutNGV0jhC+cuUJ+KFN13kBILnb8spqkv/R0lguqqCitFTsBVLsq1cxF1vTXxee7r\nJXX5kUceAgAMD4l54a49tK6BUWQX662asHIkV3OHqOG7Ruma7dot6VtdOtZGQ8w1mRlqHZEmVxDz\nyvAOMjUUFSk9OUlxAnUVvezC/FoqArR3mPbYbmUG6uml+daGxAwzxYR6wqq3uiSZz3tDkYidrjOr\nybzyLlVzQa51jgnFHWr9h/vpc1ERlcNs6qmxj/bUpUvZdxdfuQAA2DkgZsi5cfLTzylSvBOu/UiJ\nVCKqkFMDF1XCrtkbRPJOL1LM4sSYxGH099C98MB9D8q82PzaVqbELpt8Ap2wD87UKneZMw1qk4eL\nu0iWkLH8fXY+fQ4e+5Jz0G8j1b/bx4G6T3JsEsvpG9+ldFZ+9ck6cRKrwUvoHh4eHtsEmy6huzdf\nqMiG0SF6G2sp72svkgthfyzHHRmgN16xIG/+fETS78SNC9JFmySffXeRS2OozluukaQ0NCJujlMc\ngTc3L65Prhzijh0iUUWsQbQUeelyczS16xP/2P1ttYXsiWN6pw4qSc1wbcG8EUm+wERRYkVDWQ1/\n/X++DABIVZL+gF29qopk7mFp7cARmvfwoEijg6MURTqgxlTkPCSzJ6QQxw9PUIrQpmIWHWcaKamp\nxr89vE/Sh77j0TdTX0xsVZRk54SijlrXmN3wGnOSS6TLLn+lspC3fX0kmY5z/hwAmORCGSUVqTiy\nk+ZWLtM1HKqtjMDt7xNpNOTxtVUxDxftNz0lY5qf53TISrsMObrw4lUZU22epOveXtECnbtimx0E\njJH5F1wUY0WuYcm6yFIlxfF9VCkpTZaLvewZFKm9zARlfV7GHrP07+K+DyqN4sRJcuE8elSlxWVp\n+No1cWUsKpfc5dDSsCu6kir31gV2CZ6YIC10dkbOe/rFHwAATr7w3azt8GFyFz5w+N6srX+INW0l\n2WZpo1XRE/dtuMTNkgtM6EIkLr8KE5qpqovqjg/V8W7XL4myXoV5d8RrrI/L+pTrrp8tG4GX0D08\nPDy2CfwD3cPDw2ObYNNNLi5iq7cq6nBfDxNWKgXrvCV1cXJGVKmhHhp+JS9kQ8I1HS9cu5C1jfST\nr/d+VtFaqiznD56hCNWrYzNZW0+VzDC5nKjyL511BJG8A1P+3FZq0SJH5fWphFUx2xDGxsmHvNIj\nFVYi9m8tl0VFdsms0BViNanT+EZ2rJ+c6+nnqA5JKacqBrXn+bwy9re9naJxL14ls8nUWPYVHrif\nfJTzJZl/g81EOWWuevObidBsqQjEPJsGjhw6mLXdz+lVdw2JeaFWpmucsnnh8nXJxHljhis2TUpb\nnUnzWVUXs8Ppa3XSLZccTEcUd9kMVu6TtXsANMdeJmcP7RRiz0GbTRocjRqqPHQh++knKgVrxNHI\nqVLv8wXqY2hIiNoq7/eiWuNeHnvE185qso19veNYNm8v++kHgfYDp3FGVqeDXuTzq/PFbR67XLsO\nRzk2+ZqU1T69eJ324suvfDlra3N90W5L9r8N144U1XBmimJR5n/P3RS1fPheIsUbC+IA8NKzFN/x\n3HFJmPbtb5H578TLWe0dHL33YQDAkbvFDNPXT/sur/ZJmI1Tk47pKm20tl1XAzVeWdNXR48mfK/r\nyOf1aE2zJPLakacyzjhdJ1BiFXgJ3cPDw2ObYNMldBext3OH5JRwNQVTRSyO7iGJ77iSvGcNSVU2\nFJfH3iF6k/bWlItQkSSkAyyhV3vFHfCTn/hTAEBD9TXfJBKt0ZTzOk5qZ7+ctzVNEkJdkbK9NdIk\nTp46k7WNj5OkOc/Ro319suy1CklZoZU3f44j9sKGkELDFfq+t7i+G9PEZXa5HBA3wD17iAC876Ej\n0gdLay89T2TTiJKUqhxteGNSxPZKjaS1wZoc977HqYhBoHz/envpuKFBWeNpTjl8/qKsydwsaQ3z\nc0SELSgCepZTFU+r+o0xk7y5nKx/notJBCqarrdG8+rrE22gn7WagtaCmDRcVC6kyzE4LHvSuYNW\nVZGClFO/5gJZkx3s3mhUpGye3euc9gAARY6UDFWeXSeRZ4U9lITu3DYbdXFbdJGKGWEKwLK03pgT\n7e7qBVr3aeUj11ei34yoNMPFoqvHSxK3jURDicpEok9cEbfVvaN0//WoQjDz7bVJvFRFSrqcLzbQ\nbUwysitj36CkoH3sXbSHDx8Wze873/wGAOD8eblP6s/RfTyvyN4HH6LiGHv3yvlcuuoklns3ccSn\nsgzYZSltdabwrFZtoMlWfnbp+rH8vSZHXV9YQoq6366U+DcKL6F7eHh4bBNsuoTu7MU1lfUtTmhY\nBSUhHOWCDMefETvofI7cqlIjgR0ju+nt/vIJcW/6sX/
y7wAA3+WCBfW6CgTqUIGLG9elSrd7zy2q\n8lARB2/0B1Lua3eJzjM3IZJnHJJkPLJDlaBi9y6Xw6XVFGm0znbaOBXJq9uigIodOcnHsqtK0lM7\nVgUBVsHV01TZfV7lQfn5f/ofAQCPPy751L76NbKF7mC78o6yCjpiN7iiCrAY4QILPb0SnFJkd8FY\n2QudFBqrfBjXT5EEdemGuO11uGBJxFkHe3qEc9jBkmK3s9JemVN8ictzofNd9PTQfGrKDTFkiXdR\nFUcYH6fr3nJl3rjEmUa5LHPtsl25pMrI9dVIakyXZAAl+3epKv1ntlHlmplabtMyFX905ncLlVOE\nr3ucyBzmp7g4ixpzjiX0xTnhH8aukVQ9MqACuyrkktlQ0nXK2kLMZ7QqR8tuLthwtypL9/B99Pn0\nObl3nvuhZE1dDqNcBAMuQBGoPEy50AX20HFGzT9gXuHI0YdkvOzyOzb2uaxtZpLmeqYt2t34VSpv\nedeRe7K2e++n8+wYEV4j4udN3BX+qRs7d2GS5K0Kdlu1OAprTatlSVyS2ZPXQp/CFZbRmpkORtoI\nvITu4eHhsU3gH+geHh4e2wQbKXCxF8CfgApBWwBPWGv/0BgzAOAvARwAcAHAB6y1M2udZy24vCX9\nQxKVF7NTuAx9AAAgAElEQVQ61gpE9SlWSV106WYB4NJlcmt67K33Z22tRVJ5yj2SZnbsKpkwzp6m\nCuexrv7NGlRdEXA9g6SGzc2JettbJVX77qOSS+LpF04CAJ49IfldHvvJ9wJYmvr33Fkyycwy8Zeq\n92irSaaW/SOiopc4GnBAqcg2ItUv7qzvxtTimo4PvknG+e73vBsAMNgnROWPv40JTVbRe5SLXo2v\nSaiqtLuK9Fa7yHEE6tyMEHA1VltTlVL00N0PAAB27JE0w9MzZK7qYfJSp2A1bHPQxQccoabrXC4y\nQWhVcRJXMOHymBC6zsTVbchvXX6XckXmvRx1RZj2cIGLUJlNbnC+nnkVvZpymt3DKqKyj/OfhDml\nrvP6aNNUh4tTNpgUb7Vl/8UdWi+jCmLYNh2v3Xb7+sh0VcqLG2bERF5fVfZkLxeY6LTF5NHgsXc4\nfXWgIhZdXdRyQfbuFSbgQ7Ul779biPflCJaQvEyAKpIx79KmuOueriQMO8oMt2fvAQDAgQMHsran\nx+m6x8oMNnGDrs/EpBC6J068CGBpNOxdd9HYR0Z2Z209znWTo7dbHUWi8r2YU4VbHPGp3RYd72mV\nCVOg9z2T4urb8BYLXGxEQo8B/Ja19j4Abwfwq8aY+wB8FMBT1tojAJ7i/3t4eHh4bBI2UrFoDMAY\nf14wxpwAsBvA+0Gl6QDg0wC+AeC3b3UAqSvjNSAkXr3J2fGU1OaIL1dODABOv0SS71xD3nzVCpGn\nquYALp4mSeIqk0PveIdkW3TZ63p2yVt5YBe5Rl2aPpm1NduczL4i5F1tmIiiR3pkTBMstV24+LzM\np0GS7CxXKd8xLNJTr6Ux7a+K9LyjxgUhjLhNusx6FbP+G/vQPRRY8cF/8x9kjglJF6fOCimZssRR\nZPK0q9yjpmdZCkl1LhvqXxVTRwqS7hbmhZQOx0mCuqbKx7mCJakKQKkwCXvuDGlP51VmP+fyNzAk\na+Ikybk50aSmJokUtCq/RsBucEa5w7nMin1FIX6LLtvk4tokc0G5SE5N0nxemZnM2lxQTl+/EOCj\no5RLpKMCULodkvRTK+OcZ02qqQjyhIN9QtaC8ioTopPCiyofTYndFVsqA2PKRGKlKveTk4bzKujH\n3U+aZG4xAWjClaRkl4uZXJkSJbzBJesi5bywc1TuheUIlYSafY6Vm6PhNbOuPKCWaM2S7wAJSurp\nEU02IypXyaJoVPnAhRm6ns+p7KUvvfA0AGBgUK7nzp17eV4HuE+xEAyyJj88Ig4dzuVUX+uYNchY\n5avK3BbV7WxYQ9IFNmy6mlS/Nm7Jhm6MOQDgEQDfBzDCD3sAuA4yyXh4eHh4bBI2/EA3xlQBfA7A\nb1hr5/V3ll6Bqxp3jTEfMcYcN8Ycn5iYWO0QDw8PD4/XABvyQzeUz/VzAP7cWvt5bh43xoxaa8eM\nMaMAVi12aa19AsATAHDs2LEVD/0FTiJSUqRclj5U1ap0EVpDA6KGnw4opeeNaTFNTIVcl7EqatA9\nD5CadO4CqfW6cIAjKo8cETLnyEGy11wcE/X+pZd+SOefVNGGXAChX/kcX3mJzDRjk/LOM0zuhhyx\nOrpXfHn3s8q1r0dU6SLXI2y3lI9wSqpxN14/neYv/vK/ojHtFNX3hR+RWUMTSh1W5RIm56yqXO6I\nGF2DNGE10Ko2CZBTeVM4vfHklJh3nA+1soKgr9bHYyIzw/SUXEOwyj85KaRkm01OsSIqE67vGqoc\nHeUirXVB+aaHrhK7TuLDNR21X/lyzCqy99pVMo1VVPrae7jYgk4zXOYcNS0VZTwzQ7EL3a4iIDnX\nSlml/u2t0T1QKdDfkjKHRGxCSBQpGscdPq9s6BbvHbOkEEPIv1XXiT9GoSL0Uo6T4BTBUxNiXppk\nn3eX4hYAZjivTkXFMBR65P5cDmO1yYX7VGYYw2aKLL+JNi/yZ01ANhdpLNevCwF+7Rp9nivLcTne\nT47sB4AKm2vKkRzniPKrqrDGmQv0jGk2qZhLnIjZamiYooIffPC+rO3IYTLRDA/Lnqj1EileKCkn\nB/B1VyaVLGhVRV53XmtS1FBC4I8DOGGt/QP11RcBfIg/fwjAF26pZw8PDw+P1xQbkdB/HMC/BvBD\nY4xj+v4rgN8D8KQx5sMALgL4wKsZwLmz9Abcd0SyoxUDkjzSjhBWEb9RdXa2nh6SkKuqtNY995C7\n2Fe//LdZW2OOiI/yIJn5z14RZWLvHiJRD9795qytwBLfoX37srZZLg/28gmJCk2ZZLkyI26Q80zo\nthLROOZnSQvYwQTLxSkhwgb2kqQ6VVDucymTqEoatxGX20pFylsNzz1/HADw4g+FlDWg34aq3Jcr\nzuEyBgI6ApOkkEhlZ3TrvjSXChdYUKXqQkvf1/JCLAWsyXRDJUly9KzzLsurPCvdBrvvqYjeDhOG\nRhXucCpCR0meCUeD1hdkncp8PYd7hSiMWDLOrxOINzAstFA/S+GRIhbdnlxYFFJycZHGXCjIiR2h\nqDP17RohYrxQFAnRkaGWc4nUW7L/W0w8z85IpPLUNJkwm0obuJczW+ZULhsp5iDSniNA26p83hWO\nlp7gLJedjuzrBufXmZsVrTXPLpx6/k997WsAgI/951/HCig3yJRdEm0s+96Rhk5ZNEu0B1daUSTa\nF559BgCwOCOm3EF2x7w8Jm01drnMR7JPU9Yaa1XlSslupflIFQfhUn5hQPOfmhEX1QvnKSp7dkYi\nZZ89zsVplMvvXtbId43K82R0Fz0Ldo1IW4WzvJqSyg0TrO1Wuxo24uXyHaydAfI9a7R7eHh4eLzB\n8JGiHh4eHt
sEm56c6/mzZP7Y94D4hqcg9cZoApBVtHlFyszOElEzOPBw1vbex38SAPDwmyQRz5Of\n/ys6HyfW6e0Vc8DuXUQeVmuqtmNM/Q/slOUZPUjq8pwqSPDs82TWGFtUPq85ImB7R4UcGjpMbVlB\nBOXzfYoLd5y9ruqiMmPUVFGRdV6KOF2/gMC3v/kVAEBDpQ/N50iFLJV1cQxO32o5EZN6twc5Z3KR\ncRYLXJBBFbjIcxKtqCJzLeZprgWVVMhliDUq9a8jubtcOKOlyM7MRKEj6/h4Xas0C/NV5o2+Cn3u\nrci1q5aYKM3J+XLs82yStU1Y3SU+zzTvSKXFTdhEoKMdIyZjddKlIptVmnUxYTQ5bXBTtnNm4gpy\nzh9Z9v+pEy8DAC5euJC1uYhnq3yed42SM8BAr/hLNxuNJX8BYJZNB1OK+G2yidMlk2vo4+fJlBSo\n9S9HtMYu+RcAXL8uft3L0Y1VMQ0m6E0s18lFprqrZCHzdyTq4qIsmCuscvdRMde++eFjAIBnXpSi\nF997mlJEzy6KaSphQnnH6K6s7bHHHgMARGqPX7hIMSzf+x4l+3vgPolKr/Eaj6s5j4+TM4DbwwCw\nkxOAHTx4QPpnJ4P6gpiwnMNBLhLytrVKgrr14CV0Dw8Pj22CTZfQT8+R5DeZqHSjOZLWgo56e6Wu\nPJNIqLtGiaj6iR8TQrOYI2nl4H6J/PzZX/wgAOCzf/U31Nd1Oe/YnMsRcjZry7NkMN0UCeHsRX4L\nqzemHSYtoH9ESBSXw8EYVQiBJdnUkKSm85bMcRRnUZWMK3L62roRCanLZKRN139jjwwTATTWFFIo\nSUgaq6myeBGPb36SyN6FeZFeuokj8ZT0ulrEGkvhuZK4aNkc9R+rkNKARfSyym/jqtInXB5uSUUA\nzhditIbAxGZJSU8DPSTJ7FVuo3tGyUVMeQOi3SKpLrCiBUQs8vXVhChbjtOnJRXs/fffx/3LdXJL\nEiiKKeWowHEVKevyBLWbQnImLi2rksIPHT4AABjeQXNI1JrnWDPoU+mLHaGqq745l8OTp05lbS7n\njfsOEPfXVGkhddZ+GzzORkNFKrMmpYtpXOLCLbosYLJOyTS7pJiDS3Ai37soT+dxmhodRUp/Soo8\n/4l3vYe/Ummumag9+rBo/A+8hcotBlq54xPqQiyHDpG7cqSu8YEjlGZ31z4im0sl2S+umIuelyvm\nkqhozx1cKKVHlfQLWbsJlLtwwg4PXaXepWbt9VwNXkL38PDw2CbwD3QPDw+PbYJNN7mcmqV3yhe+\n88Os7eH9pHLuzAs5UGZVb3SnRICODpH6edchlRCII/DGJoTs+cRnyNTyzPNELLlIVEDlBlJV2i2b\nHJKCqLcJmxciiMoVM8kaByrK062oIj5bHSa5WJWKIrEHhKxWW5W4KmZaKKejNzl6rNNdP3LMdjnZ\nWUXUxgUmV7uJ+Avfcy+ltLW7SOW8oaICb3BU4OKskG2OINMmApvQeSuRqJL3vInSkV6bE/JqYp7M\nOs2O9N/kSkEuKrWgIoUrbF7qU4mohrly++guuf6Hd5Of+I6C2BwW2Xd9elpMTiGTjeWKkOFVjswd\nVImYlqPbUgTcIpkVApV62ZkNdCWihH3Nz5w5nbUtcHrdvDJXuMpO2q895VDBgKNtoUxzg2wu02Rr\ng1MvN5uyrpcvX1lxnAs8tKr2ZYMThmlzSX2STEM5HmesfP5jjqKsKz/0mP3fE5UcbY0MIDxOMTmF\n7FcfWZl/h+/dmKN441jMFq4PXZfUWTpitSddfc+OSqm8ax/XIU0VKc+fA3Xfn79EPv5NVcXJna+n\n9+CK/mfmqN8okutaqR3gwalkd3M072vjEkPg/PALKkW4C4I1VTlfa2btmrerwUvoHh4eHtsEmy6h\nL/Ib6qvPikRz+hWKHv2Zt0iOhLt2kRR4/pxEar7zrSRlFlX04kKH3vhP/t3TWduzL5NbVcNFpSkJ\n2bmIpYrMce5TVok5Cb/x20pq7rLUYIxIMm2OuNRESRQtrX1ZVnkm8iyNKA4FCROKmliJmTzM94h7\n5WqYukYSWtKVN3uTpabGZUlRO8BRo8OcUjaniimUOOlKU1UusFnq0ZXSWKMp0v07udjI/fdKgY1L\nl8j1a2pWUq+2mWRzZGikyO4Ss1dDigDtq1S4d+n/+iTN59Sk5PIwTGjVdgjZVeL6ouUe0fhcat6q\ncu9bjlIk17rDUrAjrAGpkRmE2uWTrl1N1XQtshtoVeWBCXluZRX57CTiMycpH9DctEh0cxzRmSgX\nxVye3UtV/wUW84xiABsccXpjWrTWBhOkoVr3/l7Or8MaXUP5VMacLyZdIo2vTFVrzNoy4re+9XWZ\nT0wFJioqKjPhPdhlKbirtCGXw0bfV13WhhIljTuysdVWhSgSlz5X5prjlL8DfVJYp1p1xVZkDlLm\n0yz5CwABS+96zgE/zyKVIyYwK4/L0tVoz1zDz5OyOl/r1hIaegndw8PDY5vAP9A9PDw8tgk23eQy\nOERJiqZnRJUa4yi2f3hBKgYl3f38SVSZYU4Ra0JRzX9wnCLE/uZr383a2imrdaxmBcHK91jSVmQX\n61mpUuWcqqejPHOs3hntCMwJhaIl1WHoOJdMLFT9h5bVRkXOpC5RljK5jO4k00BPbW0TAQDsHCXy\n7MolSQGatNlcovx6z58mP+U59g3XK1JnX/e6SiaVZsSTNk3RWnTaopo/+50vAwDeVRGTwwM832av\n+Is7AtBFA7c6YiKa4+hNR84CwMWTFIE32ZSEXa0c9V/aIf71/TtJbS7UlHmDI0XL2oebU76acO1b\nQMc8JEzQuWhjPYe2mr8jRUuKKAvYJNisi193e5rMgJd0tSFXMYj3mk6E5oj0XFH6D7iLTkdX4iHz\nSqsl520xAa3p9CJfk25T9n2XUyk78lKTmI4MNMoMGfP6WGWiyOfWJu2LOZWALaTPoUqRXWAnhNSZ\nHBUBGXC/On1zyknMlpoy2DRkZe+6VMLWatMQjV3d4gi4Rm4Uynq2uVKWI0f1Irq6pd2uimh1kcJq\n7zgzzWrPnY6KfLV8npZ6nBTCqeU/WRdeQvfw8PDYJth0Cd1Jsjnltha3SKI6Py7SWLtOUXvvfLNU\nji/1UY6EOVUI4pvfp/SxTVU/0JEnBXYV065HOl+FQ8gSwpLynSwYFJREZ5yIFKi2AudNURFlzq3J\nvckXlKTmIuvaykWrt5/dNkcVYcP+kE2Vy2Y17DtK6TjnVerZ+hUn6SpXSia3prnfvIrs7PDaaQJu\nNXc0Y1e2nXmR8mZcXhDJbzigNdGEVsLSyiITsNetSINnmRy7oiJVG2XWcvZJ7o2Rg6S1FftE8s6u\nhSIKq1xfs1wTDSHg/WbXIfHmZ0VDaCyQ1njjmmiIrRaNL1HjdDk8tNTmNL4gVNodRzRHinjN0hYz\niapdD7tMCup8MO027aOFOVk7t+0rNeUay2ttu7LH2ou0xq5IBgDMsTT
qJHNNNjopN7UrI4Z1fhuT\nrl2ARUceL9aJIC8r7doJ0AnLmbqYS6frxqvc+LiYh1XSuKQqlv0cMymaxGo+fN11pKwToK2VcbZb\nLr9NsuJ4d13tEkeBhNvkuIxQVUe584QdPXbOodMv+3R0r86/dHN4Cd3Dw8Njm8A/0D08PDy2CW5q\ncjHGFAF8C0CBj/+stfZ3jTEDAP4SwAEAFwB8wFo7s9Z51oIjlnSkZhqSutiBsAPji6QGPXtKUnW+\nt0Fqy4IVM8TVGfpcrAopFzfoPC1WKV3dRwCIctGS7wDlX2y03yqnmVXmFZdyNlcQ9XaR/XU7sZhV\nnPnFmRy0eaXOUatV5Q/bz8l8OkodPsm+yTnN4qyCWj8RhMMjkjBrjE0uS1Q+/ttms0pXWU+cr3ey\nTtQfoIww2jTA6np9UvxngwIRlaFKDnWN+3getO5nI7UmVVLhK3slinN4FyVbG1RVhArs191R47Rs\nEihEipTmz6Emql1yJE1oL8P1CxIb4Srm6KhIRxBGKn2vCVf6K+fZvKOTk7nvtfkvZhPD4iKp4Z22\nmBxStkcERvpP2U87XxD/+pHdu/gcEtE5P0O3ZawqEFlHwKqL1+g4s4YzZejMWVhxfM4l04I2Ya5t\nErx8WVLanhmjsVRUjdDImfqyPmRdXTRoqpLT5QvBijZnpkl0Djte61CZ4Vz65kDbtdz1VGZVd31c\n3ESa6ChSd0107eMc/27lnlxlOdFVKYKTAdofu1WN0l65tBvCRiT0NoB3W2vfBOBhAI8bY94O4KMA\nnrLWHgHwFP/fw8PDw2OTsJESdBaA84HK8T8L4P0A3sXtnwbwDQC/fcsjcG8yXRGcoxhTFdnlcqmc\nvyESwCeepLqh737Xsazt/DWSDOs62stJ0i7dqJIKyvzWzqvCFc0Fkq6XEFssVeeKsmRO8lvqtsTu\nUOoN3WTXNNemJcU+lqgHOQk+AExMUYTg7KQkzp+9SBGyhw8dxHooceRnQUVZ5jiXSaJIMTe6OEvP\nqcguu+zvGsh+oaTRRZ7jyY6Qzb1cc/Rkazxre4k1mCl2LxzcK/MaPUjSeN+ouCMW2A0yUPk4uq7A\nhIrKC1kajtQ1dpLUEunauZKtQ4qGqXLbY8ksVURddl6ltQV2JaHeZjfMuCtam5O4l+ZBITgSPZeX\naxgy8RhpYpn3ZLGgXP9K9JvpKdE4XVrcnHKly3IDKc3U1fS0WQroVaIiddEP3v+LqphKoy6awXIE\nVu6xnIuUTEQKdxpB5oYYKrdFlt6XaFksGWvu3q2/Vdq1uxh2qY8igKUaknPQiNWe6HK/KT+TbKAl\nb/6rpXwejFlCijIprQjwmCPUayo30Z4HyeEjMnLdZ09LjquNYEM2dGNMyAWibwD4irX2+wBGrLUu\n5vo6gJE1fvsRY8xxY8zxiYlbC2P18PDw8Ng4NvRAt9Ym1tqHAewB8Kgx5oFl31usIc9Za5+w1h6z\n1h4bHh6+7QF7eHh4eKyOW/JDt9bOGmO+DuBxAOPGmFFr7ZgxZhQkvd8yBvuIMGupVKV1jl7LhypV\nLauXgfJX/+YPKMHPeVXTcLZObMj0oqjLjlussNoeKzWrUFipohdLpDbpxEURVxRK1DswZvOCWUKA\nMKGoUo922De2xImYdJWUgSEytXQUKdzm6jzNgqqOw1GD9ZbMazV0mTyqq8RKPX3Ub6su6rWLwktY\nHUz069glM1qff80i76wikersG/xtVW3qYoPaplTSoWhkLwBgdA+95A+ql/1gL61PoKJN6ywvtHT9\nTlb5dZ3TIkeARnlR74tcHamgEmHpKMy1kCpmzZkDrNX+5Tx/xSg7E4omoF2UYaLNBbyf3P4DZL85\nolZfEmcu0EnXEiagOzllLuSUtnUVr5BFoOal/xbHXyypIuT8sN24lcnFtUUq2tF2aD/NTIkprdtZ\ne3/Gyg894eM6gSaFXcI2JqzV0ynl+ypQ43UpclN1TTLTmEqi50hpbV1zv9GRr+77VNfx5PM5U5OO\nAHemHKNMQ2CCNq/NNvx86KqU1gN3U3Wk3Qf2Zm0trkf6ysnjWVupKxG/G8FNJXRjzLAxpo8/lwD8\nNICTAL4I4EN82IcAfOGWevbw8PDweE2xEQl9FMCnDSU/CAA8aa39kjHmuwCeNMZ8GMBFAB94NQNo\nscRZUK+WNktGuVDeaDG/GHWS/qBEEtyFa8pFjqW2WElNTrpvcVrQuorUdGSPlpQqeZLeSoooDViS\n0GRjqUz961waE5zyNFXuSBETIP01kh53DkgK3J07ifibVdLzPKeZXZwTsqmPCxxMqkIUq6HLBFyY\nl/n3D1O/3apaTyZIHU/a1YUDWEJXwlPmrrZEanOfI51zhNq6Jemr3Utjv6tPaJb+AYrurNZoC1bL\nIvkUmHhuqci+jovAU5J16ApGaAaSP+eUxuXI65wqMBFmUvDazG9Lufk5otKqvjLXRzV/V+xC5+1Y\nLnnzAGi4WuLla+DcBhMVddnltQiVhtblPCCJ0iQrbdJuNHnrcu60myrKcpXan+myyN841ns4x8OW\nvqbHSSnvtuV+WhJdvRx6+pzzJVD7NMf3CZxDg9JaQ/7xkuBtl25W5WgpshbSXxNC3dUP1cVZ3NqG\nKnq3wBqxy9FC/S11L9XRswtcpEOTsinvhTmVFzcaorHsPypR7v0cDX71pNQynjxLacMjdd2L6+TG\nWQ0b8XJ5EcAjq7RPAXjPLfXm4eHh4fG6wUeKenh4eGwTbHpyLqcGFpTqw3mYkHZFvXRupanyl3aJ\nglKly8UdJq8SZRrICC2XFlen5aR32sy0BLlOc781VeGml/3FayrarAgyySSpmEsiVgNDVeeyzUmc\nXLWbSLGNcWOO/6rERbNTPH9R+YscjdhaJ7IREFW2b1AIxWqFSbm2MkOxrSVOnO+x9jlmX2r1vg+y\nFKA6fSt9Hym1uczmjR61diNcCaZaEJK7wr7peZ5XR3GUi+w339QqMpNMRWXeyIfOX1tVh1nNlMHX\nvaPIrnyeyavc2uupI4DdvHPa5OeSXikCzK2i0daLdCWhCjaN6RSxLmraVS7qKJNP0yWJaop/f8yk\naEWZpkq9pMrHOukTRyMHq9hDtAkNzqzg/KuVOarC+74+L/fJvPM/V3MNgrUfKWGs1rrj7l3Z9xY0\n5hCufq+KwM0ia1UcgLFL/gJAyon4GpEkpzPZntXpsLn/rrS1us6soyNK+bdumZZEVPMaq/4d8V1T\nKZ2Hj1KMRaCeXaee/j71eUNMqCFfR12BarkZ7GbwErqHh4fHNoGxt/gGuK3OjJkAUAewPrN352MI\nW38OwPaYh5/DnYHtMAfgzp3HfmvtTQN53tAHOgAYY45ba4/d/Mg7F9thDsD2mIefw52B7TAHYOvP\nw5tcPDw8PLYJ/APdw8PDY5tgMx7oT2xCn681tsMcgO0xDz+HOwPbYQ7AFp/HG25D9/Dw8PB4feBN\nLh4eHh7bBG/oA90Y87gx5p
Qx5qwxZktUODLG7DXGfN0Y87Ix5iVjzK9z+4Ax5ivGmDP8t/9m59ps\ncF7754wxX+L/b6k5GGP6jDGfNcacNMacMMa8YwvO4Td5H/3IGPMXxpjiVpiDMeYTxpgbxpgfqbY1\nx22M+R2+z08ZY/7Z5ox6KdaYw+/zfnrRGPNXLhEhf3fHzeFmeMMe6Jzc638C+BkA9wH4JWPMfev/\n6o5ADOC3rLX3AXg7gF/lcW/FEny/DuCE+v9Wm8MfAvg7a+09AN4EmsuWmYMxZjeA/wTgmLX2AVC6\nqg9ia8zhU6C02Rqrjpvvjw8CuJ9/80fGmPVDnN8YfAor5/AVAA9Yax8CcBrA7wB39BzWxRspoT8K\n4Ky19py1tgPgM6Aydnc0rLVj1tpn+fMC6CGyGzT2T/NhnwbwC5szwo3BGLMHwM8C+GPVvGXmYIzp\nBfBOAB8HAGttx1o7iy00B0YEoGSMiQCUAVzDFpiDtfZbAKaXNa817vcD+Iy1tm2tPQ/gLOj+31Ss\nNgdr7Zet5GT4HqiID3CHzuFmeCMf6LsBXFb/v8JtWwbGmAOgzJMbLsF3B+F/APgvWFI8dEvN4SCA\nCQCfZLPRHxtjKthCc7DWXgXw3wFcAjAGYM5a+2VsoTksw1rj3qr3+r8H8H/585acgydFNwhjTBXA\n5wD8hrV2Xn+3Xgm+OwHGmJ8DcMNa+8xax9zpcwBJtm8G8L+stY+AUkgsMU3c6XNgG/P7QS+nXQAq\nxphf0cfc6XNYC1t13A7GmI+BzKt/vtljuR28kQ/0qwD2qv/v4bY7HsaYHOhh/ufW2s9z8ziX3sPt\nlOB7g/DjAN5njLkAMnW92xjzZ9hac7gC4AoXKAeAz4Ie8FtpDj8F4Ly1dsJa2wXweQA/hq01B421\nxr2l7nVjzL8F8HMAftmKH/eWmoPDG/lAfxrAEWPMQWNMHkQ4fPEN7P9VwVDezo8DOGGt/QP11ZYp\nwWet/R1r7R5r7QHQun/NWvsr2FpzuA7gsjHmbm56D4CXsYXmADK1vN0YU+Z99R4QJ7OV5qCx1ri/\nCOCDxpiCMeYggCMAfrAJ47spjDGPg0yR77PWNtRXW2YOS2CtfcP+AXgviEl+BcDH3si+b2PMj4FU\nyRcBPM//3gtgEMTsnwHwVQADmz3WDc7nXQC+xJ+31BwAPAzgOF+LvwbQvwXn8N9ANXl/BOBPARS2\nwrHn5J8AAAB8SURBVBwA/AXI7t8FaUsfXm/cAD7G9/kpAD+z2eNfZw5nQbZyd2//7zt5Djf75yNF\nPTw8PLYJPCnq4eHhsU3gH+geHh4e2wT+ge7h4eGxTeAf6B4eHh7bBP6B7uHh4bFN4B/oHh4eHtsE\n/oHu4eHhsU3gH+geHh4e2wT/Hyb0/o+hemXvAAAAAElFTkSuQmCC\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x12a3575f8>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"dataiter = iter(testloader)\n",
"images, labels = dataiter.next()\n",
"\n",
"# print images\n",
"imshow(torchvision.utils.make_grid(images))\n",
"print('GroundTruth: ', ' '.join('%5s'%classes[labels[j]] for j in range(4)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Okay, now let us see what the neural network thinks these examples above are:"
]
},
{
"cell_type": "code",
"execution_count": 66,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Predicted: cat plane plane plane\n"
]
}
],
"source": [
"outputs = net(Variable(images))\n",
"\n",
"# the outputs are energies for the 10 classes. \n",
"# Higher the energy for a class, the more the network \n",
"# thinks that the image is of the particular class\n",
"\n",
"# So, let's get the index of the highest energy\n",
"_, predicted = torch.max(outputs.data, 1)\n",
"\n",
"print('Predicted: ', ' '.join('%5s'% classes[predicted[j][0]] for j in range(4)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The results seem pretty good. \n",
"\n",
"Let us look at how the network performs on the whole dataset."
]
},
{
"cell_type": "code",
"execution_count": 72,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Accuracy of the network on the 10000 test images: 59 %\n"
]
}
],
"source": [
"correct = 0\n",
"total = 0\n",
"for data in testloader:\n",
" images, labels = data\n",
" outputs = net(Variable(images))\n",
" _, predicted = torch.max(outputs.data, 1)\n",
" total += labels.size(0)\n",
" correct += (predicted == labels).sum()\n",
"\n",
"print('Accuracy of the network on the 10000 test images: %d %%' % (100 * correct / total))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"That looks waaay better than chance, which is 10% accuracy (randomly picking a class out of 10 classes). \n",
"Seems like the network learnt something.\n",
"\n",
"Hmmm, what are the classes that performed well, and the classes that did not perform well:"
]
},
{
"cell_type": "code",
"execution_count": 68,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"class_correct = list(0. for i in range(10))\n",
"class_total = list(0. for i in range(10))\n",
"for data in testloader:\n",
" images, labels = data\n",
" outputs = net(Variable(images))\n",
" _, predicted = torch.max(outputs.data, 1)\n",
" c = (predicted == labels).squeeze()\n",
" for i in range(4):\n",
" label = labels[i]\n",
" class_correct[label] += c[i]\n",
" class_total[label] += 1"
]
},
{
"cell_type": "code",
"execution_count": 69,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Accuracy of plane : 64 %\n",
"Accuracy of car : 65 %\n",
"Accuracy of bird : 50 %\n",
"Accuracy of cat : 43 %\n",
"Accuracy of deer : 50 %\n",
"Accuracy of dog : 46 %\n",
"Accuracy of frog : 61 %\n",
"Accuracy of horse : 59 %\n",
"Accuracy of ship : 60 %\n",
"Accuracy of truck : 54 %\n"
]
}
],
"source": [
"for i in range(10):\n",
" print('Accuracy of %5s : %2d %%' % (classes[i], 100 * class_correct[i] / class_total[i]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Okay, so what next?\n",
"\n",
"How do we run these neural networks on the GPU?"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Training on the GPU\n",
"Just like how you transfer a Tensor on to the GPU, you transfer the neural net onto the GPU.\n",
"\n",
"This will recursively go over all modules and convert their parameters and buffers to CUDA tensors."
]
},
{
"cell_type": "code",
"execution_count": 70,
"metadata": {
"collapsed": false
},
"outputs": [
{
"ename": "AssertionError",
"evalue": "Torch not compiled with CUDA enabled",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mAssertionError\u001b[0m Traceback (most recent call last)",
"\u001b[0;32m<ipython-input-70-7b748db2b046>\u001b[0m in \u001b[0;36m<module>\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0mnet\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcuda\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/module.py\u001b[0m in \u001b[0;36mcuda\u001b[0;34m(self, device_id)\u001b[0m\n\u001b[1;32m 149\u001b[0m \u001b[0mcopied\u001b[0m \u001b[0mto\u001b[0m \u001b[0mthat\u001b[0m \u001b[0mdevice\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 150\u001b[0m \"\"\"\n\u001b[0;32m--> 151\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_apply\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcuda\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdevice_id\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 152\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 153\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mcpu\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdevice_id\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/module.py\u001b[0m in \u001b[0;36m_apply\u001b[0;34m(self, fn)\u001b[0m\n\u001b[1;32m 120\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_apply\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mmodule\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mchildren\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 122\u001b[0;31m \u001b[0mmodule\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_apply\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfn\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 123\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 124\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mparam\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_parameters\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvalues\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/module.py\u001b[0m in \u001b[0;36m_apply\u001b[0;34m(self, fn)\u001b[0m\n\u001b[1;32m 126\u001b[0m \u001b[0;31m# Variables stored in modules are graph leaves, and we don't\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 127\u001b[0m \u001b[0;31m# want to create copy nodes, so we have to unpack the data.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 128\u001b[0;31m \u001b[0mparam\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdata\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mparam\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdata\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 129\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mparam\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgrad\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 130\u001b[0m \u001b[0mparam\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_grad\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdata\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mparam\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_grad\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdata\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/nn/modules/module.py\u001b[0m in \u001b[0;36m<lambda>\u001b[0;34m(t)\u001b[0m\n\u001b[1;32m 149\u001b[0m \u001b[0mcopied\u001b[0m \u001b[0mto\u001b[0m \u001b[0mthat\u001b[0m \u001b[0mdevice\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 150\u001b[0m \"\"\"\n\u001b[0;32m--> 151\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_apply\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcuda\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdevice_id\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 152\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 153\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mcpu\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdevice_id\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/_utils.py\u001b[0m in \u001b[0;36m_cuda\u001b[0;34m(self, device, async)\u001b[0m\n\u001b[1;32m 49\u001b[0m \u001b[0mdevice\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 50\u001b[0m \u001b[0;32mwith\u001b[0m \u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcuda\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdevice\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdevice\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 51\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtype\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgetattr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcuda\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__class__\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__name__\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0masync\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 52\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 53\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/_utils.py\u001b[0m in \u001b[0;36m_type\u001b[0;34m(self, new_type, async)\u001b[0m\n\u001b[1;32m 22\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mnew_type\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mtype\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 23\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 24\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mnew_type\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msize\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcopy_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0masync\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 25\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 26\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/cuda/__init__.py\u001b[0m in \u001b[0;36m__new__\u001b[0;34m(cls, *args, **kwargs)\u001b[0m\n\u001b[1;32m 252\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 253\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__new__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcls\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 254\u001b[0;31m \u001b[0m_lazy_init\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 255\u001b[0m \u001b[0;31m# We need this method only for lazy init, so we can remove it\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 256\u001b[0m \u001b[0;32mdel\u001b[0m \u001b[0m_CudaBase\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__new__\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/cuda/__init__.py\u001b[0m in \u001b[0;36m_lazy_init\u001b[0;34m()\u001b[0m\n\u001b[1;32m 92\u001b[0m raise RuntimeError(\n\u001b[1;32m 93\u001b[0m \"Cannot re-initialize CUDA in forked subprocess. \" + msg)\n\u001b[0;32m---> 94\u001b[0;31m \u001b[0m_check_driver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 95\u001b[0m \u001b[0;32massert\u001b[0m \u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_C\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_cuda_init\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 96\u001b[0m \u001b[0m_cudart\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0m_load_cudart\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/Users/stevenhao/miniconda3/lib/python3.5/site-packages/torch/cuda/__init__.py\u001b[0m in \u001b[0;36m_check_driver\u001b[0;34m()\u001b[0m\n\u001b[1;32m 59\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_check_driver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 60\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mhasattr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_C\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m'_cuda_isDriverSufficient'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 61\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mAssertionError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Torch not compiled with CUDA enabled\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 62\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_C\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_cuda_isDriverSufficient\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 63\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_C\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_cuda_getDriverVersion\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;31mAssertionError\u001b[0m: Torch not compiled with CUDA enabled"
]
}
],
"source": [
"net.cuda()"
]
},
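{
"cell_type": "markdown",
"metadata": {},
"source": [
"The call above fails because this particular Torch build was compiled without CUDA. On a CUDA-enabled install, `net.cuda()` is only half the story: the inputs and targets also have to be sent to the GPU at every step. A minimal sketch of the inner training loop (same names as the loop above, untested here for lack of a GPU):\n",
"\n",
"```\n",
"net.cuda()  # move all parameters and buffers to the GPU\n",
"for i, data in enumerate(trainloader, 0):\n",
"    inputs, labels = data\n",
"    # wrap the batch in Variables backed by CUDA tensors\n",
"    inputs, labels = Variable(inputs.cuda()), Variable(labels.cuda())\n",
"    optimizer.zero_grad()\n",
"    outputs = net(inputs)\n",
"    loss = criterion(outputs, labels)\n",
"    loss.backward()\n",
"    optimizer.step()\n",
"```"
]
},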
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Why dont I notice MASSIVE speedup compared to CPU? Because your network is realllly small.\n",
"\n",
"**Exercise:** Try increasing the width of your network \n",
"(argument 1 and 2 of `nn.Conv2d`, see what kind of speedup you get."
]
},
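{
"cell_type": "markdown",
"metadata": {},
"source": [
"A sketch of one possible widening (the channel counts 18 and 48 are arbitrary picks; the only constraints are that conv2's input matches conv1's output and that `fc1`'s input matches the flattened size after conv2):\n",
"\n",
"```\n",
"class WideNet(nn.Module):\n",
"    def __init__(self):\n",
"        super(WideNet, self).__init__()\n",
"        self.conv1 = nn.Conv2d(3, 18, 5)    # 6 -> 18 output channels\n",
"        self.pool = nn.MaxPool2d(2, 2)\n",
"        self.conv2 = nn.Conv2d(18, 48, 5)   # input channels must match conv1's output\n",
"        self.fc1 = nn.Linear(48*5*5, 120)   # flattened size follows conv2's channels\n",
"        self.fc2 = nn.Linear(120, 84)\n",
"        self.fc3 = nn.Linear(84, 10)\n",
"\n",
"    def forward(self, x):\n",
"        x = self.pool(F.relu(self.conv1(x)))\n",
"        x = self.pool(F.relu(self.conv2(x)))\n",
"        x = x.view(-1, 48*5*5)\n",
"        x = F.relu(self.fc1(x))\n",
"        x = F.relu(self.fc2(x))\n",
"        x = self.fc3(x)\n",
"        return x\n",
"```"
]
},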
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"#### Goals achieved:\n",
"\n",
"- Understanding PyTorch's Tensor library and neural networks at a high level.\n",
"- Train a small neural network to classify images"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Where do I go next?\n",
"\n",
"- [Train neural nets to play video games](https://goo.gl/uGOksc)\n",
"- [Train a state-of-the-art ResNet network on imagenet](https://github.com/pytorch/examples/tree/master/imagenet)\n",
"- [Train an face generator using Generative Adversarial Networks](https://github.com/pytorch/examples/tree/master/dcgan)\n",
"- [Train a word-level language model using Recurrent LSTM networks](https://github.com/pytorch/examples/tree/master/word_language_model)\n",
"- [More examples](https://github.com/pytorch/examples)\n",
"- [More tutorials](https://github.com/pytorch/tutorials)\n",
"- [Discuss PyTorch on the Forums](https://discuss.pytorch.org/)\n",
"- [Chat with other users on Slack](http://pytorch.slack.com/messages/beginner/)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 0
}