Understanding Code with Graph Neural Networks
{ | |
"cells": [ | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "XiP6-MBFmKAB" | |
}, | |
"source": [ | |
"# 0. Introduction\n", | |
"\n", | |
"In this notebook, we take a closer look at how to apply Graph Neural Networks (GNNs) to the task of graph-level prediction on the ogbg-code2 dataset. The prediction task is defined as: \n", | |
"Given a graph representation of a program, specifically the body of a method, generate the set of tokens that form the method's name.\n", | |
"\n", | |
"\n", | |
"\n", | |
"The dataset is a collection of 450,000 abstract syntax trees (ASTs) generated from Python GitHub repositories. The dataset aligns well with our task as it also incorporates nodes and edges from the AST as well as the tokenized method names from which the AST was generated. The training, validation, and test splits are provided with the dataset." | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "cDdFlkbqQYfq" | |
}, | |
"source": [ | |
"# 1. Install Dependencies" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 1, | |
"metadata": { | |
"id": "Be3f5csQWOBY" | |
}, | |
"outputs": [], | |
"source": [ | |
"# Basic Python dependencies\n", | |
"import os\n", | |
"import time\n", | |
"from collections import defaultdict, namedtuple\n", | |
"import random\n", | |
"random.seed(2)\n", | |
"\n", | |
"# Basic data handling libraries\n", | |
"import numpy as np\n", | |
"from tqdm import tqdm, trange\n", | |
"import pandas as pd\n", | |
"import copy\n", | |
"import json\n", | |
"import matplotlib.pyplot as plt" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "hh8eI-xgRcK9" | |
}, | |
"source": [ | |
"We'll be using [PyG](https://pytorch-geometric.readthedocs.io/en/latest/) (PyTorch Geometric), a library built upon PyTorch to easily write and train Graph Neural Networks (GNNs) on structured datasets.\n", | |
"\n", | |
"Next, we will load the [Open Graph Benchmark](https://ogb.stanford.edu/docs/lsc/) (OGB) dataset from the ogb package. OGB is a collection of realistic, large-scale, and diverse benchmark datasets for machine learning on graphs. The ogb package not only provides data loaders for each dataset but also model evaluators.\n", | |
"\n", | |
"_Note: This cell might take a while (~5 minutes) to run_" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 2, | |
"metadata": { | |
"colab": { | |
"base_uri": "https://localhost:8080/" | |
}, | |
"id": "pOCAIKchQfK2", | |
"outputId": "4ca1b8d7-cc15-42e7-e88b-0af60cb483be" | |
}, | |
"outputs": [ | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n", | |
"Looking in links: https://pytorch-geometric.com/whl/torch-1.13.1+cu116.html\n", | |
"Requirement already satisfied: torch-scatter in /usr/local/lib/python3.9/dist-packages (2.1.1+pt113cu116)\n", | |
"Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n", | |
"Looking in links: https://pytorch-geometric.com/whl/torch-1.13.1+cu116.html\n", | |
"Requirement already satisfied: torch-sparse in /usr/local/lib/python3.9/dist-packages (0.6.17+pt113cu116)\n", | |
"Requirement already satisfied: scipy in /usr/local/lib/python3.9/dist-packages (from torch-sparse) (1.10.1)\n", | |
"Requirement already satisfied: numpy<1.27.0,>=1.19.5 in /usr/local/lib/python3.9/dist-packages (from scipy->torch-sparse) (1.22.4)\n", | |
"Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n", | |
"Requirement already satisfied: torch-geometric in /usr/local/lib/python3.9/dist-packages (2.2.0)\n", | |
"Requirement already satisfied: numpy in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (1.22.4)\n", | |
"Requirement already satisfied: requests in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (2.27.1)\n", | |
"Requirement already satisfied: jinja2 in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (3.1.2)\n", | |
"Requirement already satisfied: psutil>=5.8.0 in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (5.9.4)\n", | |
"Requirement already satisfied: tqdm in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (4.65.0)\n", | |
"Requirement already satisfied: pyparsing in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (3.0.9)\n", | |
"Requirement already satisfied: scikit-learn in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (1.2.2)\n", | |
"Requirement already satisfied: scipy in /usr/local/lib/python3.9/dist-packages (from torch-geometric) (1.10.1)\n", | |
"Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.9/dist-packages (from jinja2->torch-geometric) (2.1.2)\n", | |
"Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.9/dist-packages (from requests->torch-geometric) (1.26.15)\n", | |
"Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.9/dist-packages (from requests->torch-geometric) (2.0.12)\n", | |
"Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.9/dist-packages (from requests->torch-geometric) (3.4)\n", | |
"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.9/dist-packages (from requests->torch-geometric) (2022.12.7)\n", | |
"Requirement already satisfied: joblib>=1.1.1 in /usr/local/lib/python3.9/dist-packages (from scikit-learn->torch-geometric) (1.1.1)\n", | |
"Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.9/dist-packages (from scikit-learn->torch-geometric) (3.1.0)\n", | |
" Preparing metadata (setup.py) ... \u001b[?25l\u001b[?25hdone\n", | |
"Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n", | |
"Requirement already satisfied: ogb in /usr/local/lib/python3.9/dist-packages (1.3.5)\n", | |
"Requirement already satisfied: pandas>=0.24.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (1.4.4)\n", | |
"Requirement already satisfied: urllib3>=1.24.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (1.26.15)\n", | |
"Requirement already satisfied: scikit-learn>=0.20.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (1.2.2)\n", | |
"Requirement already satisfied: numpy>=1.16.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (1.22.4)\n", | |
"Requirement already satisfied: six>=1.12.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (1.16.0)\n", | |
"Requirement already satisfied: outdated>=0.2.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (0.2.2)\n", | |
"Requirement already satisfied: tqdm>=4.29.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (4.65.0)\n", | |
"Requirement already satisfied: torch>=1.6.0 in /usr/local/lib/python3.9/dist-packages (from ogb) (1.13.1+cu116)\n", | |
"Requirement already satisfied: setuptools>=44 in /usr/local/lib/python3.9/dist-packages (from outdated>=0.2.0->ogb) (67.6.0)\n", | |
"Requirement already satisfied: littleutils in /usr/local/lib/python3.9/dist-packages (from outdated>=0.2.0->ogb) (0.2.2)\n", | |
"Requirement already satisfied: requests in /usr/local/lib/python3.9/dist-packages (from outdated>=0.2.0->ogb) (2.27.1)\n", | |
"Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.9/dist-packages (from pandas>=0.24.0->ogb) (2022.7.1)\n", | |
"Requirement already satisfied: python-dateutil>=2.8.1 in /usr/local/lib/python3.9/dist-packages (from pandas>=0.24.0->ogb) (2.8.2)\n", | |
"Requirement already satisfied: joblib>=1.1.1 in /usr/local/lib/python3.9/dist-packages (from scikit-learn>=0.20.0->ogb) (1.1.1)\n", | |
"Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.9/dist-packages (from scikit-learn>=0.20.0->ogb) (3.1.0)\n", | |
"Requirement already satisfied: scipy>=1.3.2 in /usr/local/lib/python3.9/dist-packages (from scikit-learn>=0.20.0->ogb) (1.10.1)\n", | |
"Requirement already satisfied: typing-extensions in /usr/local/lib/python3.9/dist-packages (from torch>=1.6.0->ogb) (4.5.0)\n", | |
"Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.9/dist-packages (from requests->outdated>=0.2.0->ogb) (3.4)\n", | |
"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.9/dist-packages (from requests->outdated>=0.2.0->ogb) (2022.12.7)\n", | |
"Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.9/dist-packages (from requests->outdated>=0.2.0->ogb) (2.0.12)\n" | |
] | |
} | |
], | |
"source": [ | |
"!pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.13.1+cu116.html\n", | |
"!pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-1.13.1+cu116.html\n", | |
"!pip install torch-geometric\n", | |
"!pip install -q git+https://github.com/snap-stanford/deepsnap.git\n", | |
"!pip install ogb" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 3, | |
"metadata": { | |
"id": "1T0jRDVgV3iN" | |
}, | |
"outputs": [], | |
"source": [ | |
"import torch\n", | |
"import torch_scatter\n", | |
"import torch.nn as nn\n", | |
"import torch.nn.functional as F\n", | |
"\n", | |
"import torch_geometric.nn as pyg_nn\n", | |
"import torch_geometric.utils as pyg_utils\n", | |
"\n", | |
"from torch import Tensor\n", | |
"from typing import Union, Tuple, Optional\n", | |
"from torch_geometric.typing import (OptPairTensor, Adj, Size, NoneType, OptTensor)\n", | |
"\n", | |
"from torch.nn import Parameter, Linear\n", | |
"from torch_sparse import SparseTensor, set_diag\n", | |
"from torch_geometric.nn.conv import MessagePassing\n", | |
"from torch_geometric.utils import remove_self_loops, add_self_loops, softmax\n", | |
"from torch_geometric.nn import global_add_pool\n", | |
"\n", | |
"from torch_geometric.data import DataLoader\n", | |
"import torch_geometric.transforms as T" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 4, | |
"metadata": { | |
"id": "JaiQ4WZBfDW7" | |
}, | |
"outputs": [], | |
"source": [ | |
"# Use GPU when available\n", | |
"device = torch.device(\"cuda:0\") if torch.cuda.is_available() else torch.device(\"cpu\")" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "ipKWwVO1buFr" | |
}, | |
"source": [ | |
"# 2. Setup" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "ZPfG9w44qBxt" | |
}, | |
"source": [ | |
"## 2.1 Load Dataset\n", | |
"\n", | |
"The `ogbg-code2` dataset provides 452,741 different graphs, and the task is to learn a model that can predict a set of tokens that represents the method name for a given graph. The dataset has a pre-defined project split, where the ASTs for the train set are obtained from GitHub projects that do not appear in the validation and test sets.\n", | |
"\n", | |
"Download, extract, and import the `ogbg-code2` dataset.\n", | |
"\n", | |
"https://ogb.stanford.edu/docs/graphprop/" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 5, | |
"metadata": { | |
"id": "eLOsu-8-Y0Tx", | |
"colab": { | |
"base_uri": "https://localhost:8080/" | |
}, | |
"outputId": "49aeb1fa-2ce9-4c97-9aa8-3e0eab8a89c3" | |
}, | |
"outputs": [ | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"The ogbg-code2 dataset has 452741 graphs\n", | |
"Sample:\n", | |
"Data(edge_index=[2, 243], x=[244, 2], node_is_attributed=[244, 1], node_dfs_order=[244, 1], node_depth=[244, 1], y=[1], num_nodes=244)\n", | |
"Splits: Train: 407976, Val: 22817\n" | |
] | |
} | |
], | |
"source": [ | |
"from ogb.nodeproppred import PygNodePropPredDataset\n", | |
"from ogb.graphproppred import PygGraphPropPredDataset, Evaluator\n", | |
"\n", | |
"dataset_name = 'ogbg-code2'\n", | |
"# Load the dataset\n", | |
"dataset = PygGraphPropPredDataset(name=dataset_name)\n", | |
"print('The {} dataset has {} graphs'.format(dataset_name, len(dataset)))\n", | |
"print('Sample:')\n", | |
"# Extract sample graph\n", | |
"print(dataset[0])\n", | |
"split_idx = dataset.get_idx_split()\n", | |
"train_idx, valid_idx = split_idx['train'], split_idx['valid']\n", | |
"print(f'Splits: Train: {len(train_idx)}, Val: {len(valid_idx)}')" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "9ojDO44URLu8" | |
}, | |
"source": [ | |
"Load provided mapping for node types and attributes." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 6, | |
"metadata": { | |
"id": "kPkqdj4wQ7Vy" | |
}, | |
"outputs": [], | |
"source": [ | |
"nodetypes_mapping = pd.read_csv(os.path.join(dataset.root, 'mapping', 'typeidx2type.csv.gz'))\n", | |
"nodeattributes_mapping = pd.read_csv(os.path.join(dataset.root, 'mapping', 'attridx2attr.csv.gz'))" | |
] | |
}, | |
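{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_A quick peek at the mappings (a sketch, not part of the original notebook): each is a small pandas DataFrame mapping an integer index to an AST node type or attribute string._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Sketch: inspect the first few rows of each mapping table\n", | |
"print(nodetypes_mapping.head())\n", | |
"print(nodeattributes_mapping.head())" | |
] | |
}, | |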
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "Hon0DGcApn2f" | |
}, | |
"source": [ | |
"## 2.2 Model Configuration" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "OOmHc6Evptds" | |
}, | |
"source": [ | |
"All model configuration and hyperparameters are stored in a single `Config` object as defined below. The most notable parameters is: `max_vocab_size` which determines the size of our vocabulary. Since our model is generating a set of tokens, we can limit the predictions by having a fixed length vocabulary." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 7, | |
"metadata": { | |
"id": "QoeuE8aDiXnb" | |
}, | |
"outputs": [], | |
"source": [ | |
"batch_size, emb_dim, lr, max_iter = 128, 256, .005, 25\n", | |
"cfg = {\n", | |
" 'device': device,\n", | |
" 'dataset_name': dataset_name,\n", | |
" 'model_save_path': f'gin_{batch_size}bz_{emb_dim}emb_{lr}lr_{max_iter}it.net',\n", | |
" 'history_write_path': f'gin_{batch_size}bz_{emb_dim}emb_{lr}lr_{max_iter}it_history.json',\n", | |
" 'max_vocab_size': 5000,\n", | |
" 'max_seq_len': 5,\n", | |
" 'batch_size': batch_size,\n", | |
" 'emb_dim': emb_dim,\n", | |
" 'num_nodetypes': len(nodetypes_mapping['type']),\n", | |
" 'num_nodeattributes': len(nodeattributes_mapping['attr']),\n", | |
" 'max_depth': 20,\n", | |
" 'num_layers': 5,\n", | |
" 'max_iter': max_iter,\n", | |
" 'dropout': 0.2,\n", | |
" 'lr': lr\n", | |
"}\n", | |
"Cfg = namedtuple('Cfg', cfg)\n", | |
"cfg = Cfg(**cfg)" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "8gy3VjT_qTWK" | |
}, | |
"source": [ | |
"## 2.3 Build Vocabulary\n", | |
"\n", | |
"Here we extract the vocabulary from ground truth sequences (list of words) in training set." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 8, | |
"metadata": { | |
"id": "aC3op-nklRXb", | |
"colab": { | |
"base_uri": "https://localhost:8080/" | |
}, | |
"outputId": "d2402390-26cf-4974-ce85-f32e8cc7ab70" | |
}, | |
"outputs": [ | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"Top 5000 words proportion: 0.9566695094108582\n" | |
] | |
} | |
], | |
"source": [ | |
"# Given a sequence of words and word-to-index mapping, generate a fixed length tensor representation\n", | |
"def encode_seq(seq, v2i, PAD):\n", | |
" words_idx = [v2i[word] for word in seq[:cfg.max_seq_len]]\n", | |
" pad_factor = max(0, cfg.max_seq_len - len(words_idx))\n", | |
" return torch.as_tensor(words_idx + [PAD]*pad_factor, dtype = torch.long, device = cfg.device)\n", | |
"\n", | |
"def build_vocab(target, target_idx_list):\n", | |
" # stores mapping between words and indices\n", | |
" v2i = defaultdict(lambda: len(v2i))\n", | |
" VOID = v2i['<void>']\n", | |
" PAD = v2i['<pad>']\n", | |
" UNK = v2i['<unk>']\n", | |
"\n", | |
" target_encodings = torch.stack([encode_seq(target[idx], v2i, PAD) for idx in target_idx_list]).to(cfg.device)\n", | |
" \n", | |
" # so far we have an infinite length vocabulary which is not feasible\n", | |
" # restrict to top-k (cfg.max_vocab_size) appearing words\n", | |
" counts = torch.bincount(target_encodings.view(-1))\n", | |
" topk = torch.topk(counts, k = cfg.max_vocab_size)[1]\n", | |
" print(f'Top {cfg.max_vocab_size} words proportion: {(counts[topk].sum() / counts.sum()).cpu().numpy()}')\n", | |
"\n", | |
" topk_keep = torch.as_tensor([VOID,PAD,UNK], dtype = torch.long, device = cfg.device)\n", | |
" topk = torch.cat([topk, topk_keep])\n", | |
"\n", | |
" # rewire the indices based on the top-k search above\n", | |
" i2v = {v: k for k, v in v2i.items()}\n", | |
" v2i = {}\n", | |
" for k, new_k in zip(sorted(topk.unique().cpu().numpy()), range(len(topk))):\n", | |
" v2i[i2v[k]] = new_k\n", | |
"\n", | |
" v2i = defaultdict(lambda: UNK, v2i)\n", | |
" i2v = {v: k for k, v in v2i.items()}\n", | |
" return v2i, i2v, UNK, PAD\n", | |
"\n", | |
"# Build vocab from existing labels in *training set* only\n", | |
"v2i, i2v, UNK, PAD = build_vocab(dataset.data.y, train_idx)" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"source": [ | |
"For all the examples in the training set, roughly **96%** are in our fixed-length vocabulary. This ratio provides a litmus test of whether we need to tune `max_vocab_len`." | |
], | |
"metadata": { | |
"id": "PLKtwrCRhgR5" | |
} | |
}, | |
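{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_The next cell is not part of the original pipeline; it is a small sanity check (a sketch) that encodes one training label with `encode_seq` and maps the indices back to words with `i2v`, to verify that the vocabulary round-trips as expected._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Sanity check (sketch, not required for training):\n", | |
"# encode one training label and decode it back with the reverse mapping.\n", | |
"sample_label = dataset.data.y[train_idx[0]]   # list of sub-tokens forming a method name\n", | |
"encoded = encode_seq(sample_label, v2i, PAD)  # fixed-length tensor of vocabulary indices\n", | |
"decoded = [i2v[int(i)] for i in encoded if int(i) != PAD]\n", | |
"print(sample_label, encoded.tolist(), decoded)" | |
] | |
}, | |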
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "rWVhGhMVW1BN" | |
}, | |
"source": [ | |
"## 2.4 Augment Features\n", | |
"\n", | |
"PyG's transforms are a general way to modify and customize Data objects. Here we define two such transforms: \n", | |
"1. Encode labels as Tensors with `encode_target_tensor`.\n", | |
"2. Augment next-token edges with inverse relation and edge attributes with `augment_edge`.\n" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 9, | |
"metadata": { | |
"id": "29RhVmeiXkAP" | |
}, | |
"outputs": [], | |
"source": [ | |
"def encode_target_tensor(data):\n", | |
" data.y_tensor = encode_seq(data.y, v2i, PAD).unsqueeze(dim = 0)\n", | |
" return data\n", | |
"\n", | |
"# The augment_edge transform is originally defined in the official OGB example here:\n", | |
"# https://github.com/snap-stanford/ogb (source)\n", | |
"def augment_edge(data):\n", | |
" '''\n", | |
" Input:\n", | |
" data: PyG data object\n", | |
" Output:\n", | |
" data (edges are augmented in the following ways):\n", | |
" data.edge_index: Added next-token edge. The inverse edges were also added.\n", | |
" data.edge_attr (torch.Long):\n", | |
" data.edge_attr[:,0]: whether it is AST edge (0) for next-token edge (1)\n", | |
" data.edge_attr[:,1]: whether it is original direction (0) or inverse direction (1)\n", | |
" '''\n", | |
" ##### AST edge\n", | |
" edge_index_ast = data.edge_index\n", | |
" edge_attr_ast = torch.zeros((edge_index_ast.size(1), 2))\n", | |
"\n", | |
" ##### Inverse AST edge\n", | |
" edge_index_ast_inverse = torch.stack([edge_index_ast[1], edge_index_ast[0]], dim = 0)\n", | |
" edge_attr_ast_inverse = torch.cat([torch.zeros(edge_index_ast_inverse.size(1), 1), torch.ones(edge_index_ast_inverse.size(1), 1)], dim = 1)\n", | |
"\n", | |
"\n", | |
" ##### Next-token edge\n", | |
"\n", | |
" ## Since the nodes are already sorted in dfs ordering in our case, we can just do the following.\n", | |
" attributed_node_idx_in_dfs_order = torch.where(data.node_is_attributed.view(-1,) == 1)[0]\n", | |
"\n", | |
" ## build next token edge\n", | |
" # Given: attributed_node_idx_in_dfs_order\n", | |
" # [1, 3, 4, 5, 8, 9, 12]\n", | |
" # Output:\n", | |
" # [[1, 3, 4, 5, 8, 9]\n", | |
" # [3, 4, 5, 8, 9, 12]\n", | |
" edge_index_nextoken = torch.stack([attributed_node_idx_in_dfs_order[:-1], attributed_node_idx_in_dfs_order[1:]], dim = 0)\n", | |
" edge_attr_nextoken = torch.cat([torch.ones(edge_index_nextoken.size(1), 1), torch.zeros(edge_index_nextoken.size(1), 1)], dim = 1)\n", | |
"\n", | |
"\n", | |
" ##### Inverse next-token edge\n", | |
" edge_index_nextoken_inverse = torch.stack([edge_index_nextoken[1], edge_index_nextoken[0]], dim = 0)\n", | |
" edge_attr_nextoken_inverse = torch.ones((edge_index_nextoken.size(1), 2))\n", | |
"\n", | |
"\n", | |
" data.edge_index = torch.cat([edge_index_ast, edge_index_ast_inverse, edge_index_nextoken, edge_index_nextoken_inverse], dim = 1)\n", | |
" data.edge_attr = torch.cat([edge_attr_ast, edge_attr_ast_inverse, edge_attr_nextoken, edge_attr_nextoken_inverse], dim = 0)\n", | |
"\n", | |
" return data" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 10, | |
"metadata": { | |
"id": "j80XYIl_f4gI" | |
}, | |
"outputs": [], | |
"source": [ | |
"# Compose multiple transforms and apply it to the dataset\n", | |
"dataset.transform = T.Compose([augment_edge, encode_target_tensor])" | |
] | |
}, | |
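{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_A quick check (a sketch, not part of the original notebook): pull one graph after the transforms above and look at its shapes. The augmented graph contains the AST edges, their inverses, the next-token edges, and their inverses, each with a 2-dimensional attribute, plus the encoded label `y_tensor`._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Sketch: transforms are applied lazily on access, so dataset[0] is already augmented\n", | |
"g = dataset[0]\n", | |
"print(g.edge_index.shape)  # AST + inverse AST + next-token + inverse next-token edges\n", | |
"print(g.edge_attr.shape)   # one (edge type, direction) attribute per edge\n", | |
"print(g.y_tensor.shape)    # encoded label: [1, cfg.max_seq_len]" | |
] | |
}, | |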
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "jcjP4V9IKq30" | |
}, | |
"source": [ | |
"## 2.5 Create Data Loaders" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "78Mho4d5zVRZ" | |
}, | |
"source": [ | |
"PyG automatically takes care of batching multiple graphs into a single giant graph with the help of the `DataLoader` class:" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 11, | |
"metadata": { | |
"colab": { | |
"base_uri": "https://localhost:8080/" | |
}, | |
"id": "8knLefdXNrYd", | |
"outputId": "964a74fe-97cb-4844-8964-b71f1f50786e" | |
}, | |
"outputs": [ | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"Number of batches: Train: 3188, Val: 179\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"/usr/local/lib/python3.9/dist-packages/torch_geometric/deprecation.py:12: UserWarning: 'data.DataLoader' is deprecated, use 'loader.DataLoader' instead\n", | |
" warnings.warn(out)\n" | |
] | |
} | |
], | |
"source": [ | |
"# use evaluation metrics defined in the OGB dataset\n", | |
"evaluator = Evaluator(cfg.dataset_name)\n", | |
"\n", | |
"# create data loaders for each split\n", | |
"# shuffle = true for training set ensures data is reshuffled for every epoch\n", | |
"train_loader = DataLoader(dataset[train_idx], batch_size = cfg.batch_size, shuffle = True) # 407976\n", | |
"valid_loader = DataLoader(dataset[valid_idx], batch_size = cfg.batch_size, shuffle = False) # 22817\n", | |
"print(f'Number of batches: Train: {len(train_loader)}, Val: {len(valid_loader)}')" | |
] | |
}, | |
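{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_As a quick illustration (a sketch, not in the original notebook), the cell below pulls a single mini-batch from `train_loader` to show how PyG collates the graphs into one large disconnected graph: node features are stacked and the `batch` vector records which graph each node belongs to._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Sketch: peek at one mini-batch to see how graphs are collated\n", | |
"example_batch = next(iter(train_loader))\n", | |
"print(example_batch.num_graphs)      # up to cfg.batch_size graphs per batch\n", | |
"print(example_batch.x.shape)         # node features of all graphs stacked together\n", | |
"print(example_batch.batch.shape)     # per-node graph assignment vector\n", | |
"print(example_batch.y_tensor.shape)  # [num_graphs, cfg.max_seq_len] encoded labels" | |
] | |
}, | |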
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "MFdpuYnKRuWA" | |
}, | |
"source": [ | |
"# 3. Create Model\n", | |
"\n", | |
"Here are the steps to train a GNN for graph property prediction:\n", | |
"\n", | |
"1. Embed each node by performing multiple rounds of message passing\n", | |
"2. Aggregate node embeddings into a unified graph embedding (readout layer)\n", | |
"3. Train a final classifier on the graph embedding\n", | |
"\n", | |
"\n", | |
"\n", | |
"The figure above illustrates a high-level architecture of our GNN model.\n", | |
"\n", | |
"For [Graph Isomorphism Network](https://cs.stanford.edu/people/jure/pubs/gin-iclr19.pdf) (GIN), the readout layer simply takes the sum of \n", | |
"node embeddings. PyG provides this functionality via `global_sum_pool`, which takes in the node embeddings of all nodes in the mini-batch and the assignment vector batch to compute a graph embedding of size `[batch_size, hidden_dim]` for each graph in the batch.\n", | |
"\n", | |
"`NodeEncoder` defines 3 `nn.Embedding` layers to encode node features: type, attribute, and depth." | |
] | |
}, | |
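{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_A minimal toy example (a sketch, not part of the original notebook) of the `global_add_pool` readout: five node embeddings assigned to two graphs by the `batch` vector are summed into two graph embeddings._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Toy sketch of the readout: sum-pool node embeddings per graph\n", | |
"toy_nodes = torch.randn(5, cfg.emb_dim)    # 5 node embeddings\n", | |
"toy_batch = torch.tensor([0, 0, 0, 1, 1])  # first 3 nodes -> graph 0, last 2 -> graph 1\n", | |
"toy_graph_emb = global_add_pool(toy_nodes, toy_batch)\n", | |
"print(toy_graph_emb.shape)                 # -> [2, emb_dim]" | |
] | |
}, | |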
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "qyhDpe39nQ_E" | |
}, | |
"source": [ | |
"## 3.1 Define Model\n", | |
"\n", | |
"The final architecture for applying GNNs to the task of graph property prediction then looks as follows and allows for complete end-to-end training:" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 12, | |
"metadata": { | |
"id": "PAQqTDiwneC7" | |
}, | |
"outputs": [], | |
"source": [ | |
"# Embed node type, attribute, and depth\n", | |
"class NodeEncoder(torch.nn.Module):\n", | |
" def __init__(self, cfg):\n", | |
" super(NodeEncoder, self).__init__()\n", | |
" self.type_enc = nn.Embedding(cfg.num_nodetypes, cfg.emb_dim)\n", | |
" self.attr_enc = nn.Embedding(cfg.num_nodeattributes, cfg.emb_dim)\n", | |
" self.depth_enc = nn.Embedding(cfg.max_depth + 1, cfg.emb_dim)\n", | |
" self.cfg = cfg\n", | |
" self.apply(self._init_weights)\n", | |
"\n", | |
" def _init_weights(self, module):\n", | |
" for param in module.parameters():\n", | |
" param.data.uniform_(.0, 1.0)\n", | |
"\n", | |
" def forward(self, x, depth):\n", | |
" depth[depth > self.cfg.max_depth] = self.cfg.max_depth\n", | |
" # add the node feature embeddings\n", | |
" return self.type_enc(x[:,0]) + self.attr_enc(x[:,1]) + self.depth_enc(depth)\n", | |
"\n", | |
"# Apply GIN convolution given the graph structure\n", | |
"# message propagation is enabled by inheriting from MessagePassing\n", | |
"# use sum \"add\" aggregation\n", | |
"class GINConv(MessagePassing):\n", | |
" def __init__(self, emb_dim):\n", | |
" super(GINConv, self).__init__(aggr=\"add\")\n", | |
" self.mlp = nn.Sequential(\n", | |
" nn.Linear(emb_dim, 2*emb_dim), \n", | |
" nn.BatchNorm1d(2*emb_dim), \n", | |
" nn.ReLU(), \n", | |
" nn.Linear(2*emb_dim, emb_dim))\n", | |
" self.eps = nn.Parameter(torch.Tensor([0]))\n", | |
" self.edge_encoder = nn.Linear(2, emb_dim)\n", | |
" self.apply(self._init_weights)\n", | |
"\n", | |
" def _init_weights(self, module):\n", | |
" for param in module.parameters():\n", | |
" param.data.uniform_(.0, 1.0)\n", | |
"\n", | |
" def forward(self, x, edge_index, edge_attr):\n", | |
" emb = self.edge_encoder(edge_attr)\n", | |
" emb = self.propagate(edge_index, x = x, edge_emb = emb)\n", | |
" emb += x * (self.eps + 1)\n", | |
" res = self.mlp(emb)\n", | |
" return res\n", | |
"\n", | |
" def message(self, x_j, edge_emb):\n", | |
" return F.relu(x_j + edge_emb)\n", | |
"\n", | |
"# Node embedding GNN\n", | |
"# Encodes node attributes with provided NodeEncoder\n", | |
"# then applies Message Passing with GINConv\n", | |
"# Intra layers: Linear --> Batch Norm --> Dropout --> Activation\n", | |
"class NodeGIN(torch.nn.Module):\n", | |
" def __init__(self, cfg):\n", | |
" super(NodeGIN, self).__init__()\n", | |
" self.encoder = NodeEncoder(cfg)\n", | |
" self.convs = nn.ModuleList([GINConv(cfg.emb_dim) for l in range(cfg.num_layers)])\n", | |
" self.bns = nn.ModuleList([nn.BatchNorm1d(cfg.emb_dim) for l in range(cfg.num_layers)])\n", | |
" self.cfg = cfg\n", | |
" self.apply(self._init_weights)\n", | |
"\n", | |
" def _init_weights(self, module):\n", | |
" for param in module.parameters():\n", | |
" param.data.uniform_(.0, 1.0)\n", | |
"\n", | |
" def forward(self, data_batch):\n", | |
" x, edge_index, edge_attr, node_depth = data_batch.x, data_batch.edge_index, data_batch.edge_attr, data_batch.node_depth\n", | |
" h_last = self.encoder(x, node_depth.view(-1,))\n", | |
" for l in range(self.cfg.num_layers):\n", | |
" h = self.convs[l](h_last, edge_index, edge_attr)\n", | |
" h = self.bns[l](h)\n", | |
" h = F.relu(h) if l < (self.cfg.num_layers-1) else h\n", | |
" h = F.dropout(h, p = self.cfg.dropout, training = self.training)\n", | |
" h_last = h\n", | |
" return h_last\n", | |
"\n", | |
"# Wrapper GNN to generate final predictions: \n", | |
"# sequence of tokens that form method name for given graph\n", | |
"# Intra layers: Linear --> Batch Norm --> Dropout --> Activation --> Aggregation\n", | |
"# GIN aggregate with global_add_pool\n", | |
"class GIN(nn.Module):\n", | |
" def __init__(self, v2i, cfg):\n", | |
" super(GIN, self).__init__()\n", | |
" self.node_GIN = NodeGIN(cfg)\n", | |
" self.pred_lins = nn.ModuleList([nn.Linear(cfg.emb_dim, len(v2i)) for i in range(cfg.max_seq_len)])\n", | |
" self.v2i = v2i\n", | |
" self.cfg = cfg\n", | |
" self.apply(self._init_weights)\n", | |
" \n", | |
" def _init_weights(self, module):\n", | |
" for param in module.parameters():\n", | |
" param.data.uniform_(.0, 1.0)\n", | |
"\n", | |
" def forward(self, data_batch):\n", | |
" node_emb = self.node_GIN(data_batch)\n", | |
" graph_emb = global_add_pool(node_emb, data_batch.batch)\n", | |
" preds = [lin(graph_emb) for lin in self.pred_lins]\n", | |
" return preds\n", | |
" \n", | |
" def save(self):\n", | |
" print('')\n", | |
" print(f'Saving model: {self.cfg.model_save_path}')\n", | |
" summary = dict(params=self.cfg, v2i=dict(self.v2i), state=self.state_dict())\n", | |
" torch.save(summary, self.cfg.model_save_path)\n", | |
"\n", | |
" @staticmethod\n", | |
" def load(model_path_fname, use_cuda = True):\n", | |
" data = torch.load(model_path_fname)\n", | |
" cfg, v2i, state = data['params'], data['v2i'], data['state']\n", | |
" model = GIN(v2i, cfg)\n", | |
" model.load_state_dict(state)\n", | |
" if use_cuda: model.cuda()\n", | |
" model.eval()\n", | |
" return model" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "QNSBk2iKnXDq" | |
}, | |
"source": [ | |
"## 3.2 Create Model" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "H0_3qNYL6hfV" | |
}, | |
"source": [ | |
"Instantiate the model and load it on the GPU if available. Use [Adam optimizer](https://pytorch.org/docs/stable/generated/torch.optim.Adam.html) and [Cross-Entropy Loss](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html) for multi-class classificaiton." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 13, | |
"metadata": { | |
"id": "vcCnD9j9R3Vb" | |
}, | |
"outputs": [], | |
"source": [ | |
"model = GIN(v2i, cfg).to(cfg.device)\n", | |
"optimizer = torch.optim.Adam(model.parameters(), lr = cfg.lr)\n", | |
"ceLoss = torch.nn.CrossEntropyLoss()" | |
] | |
}, | |
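{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_A quick sketch (not in the original notebook): count the trainable parameters of the freshly instantiated model, which is handy when comparing embedding sizes or layer counts._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Sketch: report the number of trainable parameters\n", | |
"n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)\n", | |
"print('Trainable parameters:', n_params)" | |
] | |
}, | |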
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "MUhZo6hsVi4Q" | |
}, | |
"source": [ | |
"## 3.3 Create Evaluator\n", | |
"\n", | |
"Wrapper to evaluate on validation and test sets using the OGB provided evaluation metric. The `decode` function takes a sequence of predicted tokens and converts them back to words using the reverse vocabulary mapping `i2v`." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 14, | |
"metadata": { | |
"id": "0z12QwKXVohb" | |
}, | |
"outputs": [], | |
"source": [ | |
"class Eval():\n", | |
" def __init__(self, model, loaders, evaluator, i2v, PAD, cfg):\n", | |
" self.model = model\n", | |
" self.loaders = loaders\n", | |
" self.ogb_eval = evaluator.eval\n", | |
" self.i2v = i2v\n", | |
" self.filter_tokens = torch.as_tensor([PAD], dtype = torch.long, device = cfg.device)\n", | |
" self.cfg = cfg\n", | |
" \n", | |
" def decode(self, pred):\n", | |
" res = []\n", | |
" for p in pred:\n", | |
" mask = torch.isin(p, self.filter_tokens, invert = True)\n", | |
" p_trimmed = p[mask]\n", | |
" res.append([self.i2v[i] for i in p_trimmed.cpu().numpy()])\n", | |
" return res\n", | |
"\n", | |
" def eval(self):\n", | |
" loaders_metrics = []\n", | |
" for loader in self.loaders:\n", | |
" preds_acc = []\n", | |
" labels_acc = []\n", | |
" for batch_id, batch in enumerate(tqdm(loader, desc='Eval Batch')):\n", | |
" batch = batch.to(self.cfg.device)\n", | |
" # disable auto grad\n", | |
" with torch.no_grad():\n", | |
" pred = self.model(batch)\n", | |
" pred_max = torch.cat([torch.argmax(p, dim = 1).view(-1,1) for p in pred], dim = 1)\n", | |
" pred_max_decoded = self.decode(pred_max)\n", | |
" preds_acc.extend(pred_max_decoded)\n", | |
" labels_acc.extend([*batch.y])\n", | |
" metrics = self.ogb_eval({'seq_pred': preds_acc, 'seq_ref': labels_acc})\n", | |
" metrics['acc'] = Eval.compute_accuracy(preds_acc, labels_acc)\n", | |
" loaders_metrics.append(metrics)\n", | |
" return loaders_metrics\n", | |
"\n", | |
" @staticmethod\n", | |
" def compute_accuracy(seq_pred, seq_ref):\n", | |
" acc = []\n", | |
" for l, p in zip(seq_ref, seq_pred):\n", | |
" label = set(l)\n", | |
" prediction = set(p)\n", | |
" n = len(label)\n", | |
" true_positive = len(label.intersection(prediction))\n", | |
" false_positive = len(prediction - label)\n", | |
" false_negative = len(label - prediction)\n", | |
" true_negative = max(n - (true_positive + false_positive + false_negative), 0)\n", | |
" acc.append((true_positive + true_negative) / n)\n", | |
" return np.average(acc)\n", | |
"\n", | |
"model_evaluator = Eval(model, [valid_loader], evaluator, i2v, PAD, cfg)" | |
] | |
}, | |
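{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_Optionally (a sketch, not part of the original notebook), run one evaluation pass before training to get an untrained baseline; the accuracy and F1 reported during training can then be read relative to this starting point._" | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": null, | |
"metadata": {}, | |
"outputs": [], | |
"source": [ | |
"# Sketch: baseline metrics of the untrained model on the validation set\n", | |
"model.eval()\n", | |
"with torch.no_grad():\n", | |
"    print(model_evaluator.eval()[0])" | |
] | |
}, | |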
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "k4R6uvBBSeFc" | |
}, | |
"source": [ | |
"# 4. Training\n", | |
"\n", | |
"Finally let's train our network to see how well it performs on the training as well as test sets.\n", | |
"\n", | |
"_Note: Due to the size of the dataset (~450k graphs), the expected training time for each iteration is about 70 minutes on a standard GPU._" | |
] | |
}, | |
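{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"_For reference (this formula is implied by, but not stated in, the original notebook), the loss below averages a cross-entropy term over the output positions:_\n", | |
"\n", | |
"$$\\mathcal{L} = \\frac{1}{L} \\sum_{i=1}^{L} \\mathrm{CE}\\left(\\hat{y}_i, y_i\\right), \\qquad L = 5$$\n", | |
"\n", | |
"_where $L$ is `max_seq_len`, $\\hat{y}_i$ are the logits of the $i$-th linear head, and $y_i$ is the index of the $i$-th target token._" | |
] | |
}, | |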
{ | |
"cell_type": "code", | |
"execution_count": 15, | |
"metadata": { | |
"colab": { | |
"base_uri": "https://localhost:8080/" | |
}, | |
"id": "GEwQQONSSg6t", | |
"outputId": "25236f72-fbc4-4e08-a8bc-2f1fe4d69474" | |
}, | |
"outputs": [ | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [07:23<00:00, 7.18it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.05it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 01, Loss: 25.8113, Accuracy: 0.0128, F1: 0.0176 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:23<00:00, 8.32it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.58it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 02, Loss: 3.8183, Accuracy: 0.0346, F1: 0.0478 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:23<00:00, 8.32it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.09it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 03, Loss: 3.7375, Accuracy: 0.0306, F1: 0.0422 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:20<00:00, 8.38it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.85it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 04, Loss: 3.7877, Accuracy: 0.0326, F1: 0.0431 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:24<00:00, 8.30it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.23it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 05, Loss: 3.8740, Accuracy: 0.0300, F1: 0.0400 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:17<00:00, 8.44it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.09it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 06, Loss: 3.7172, Accuracy: 0.0369, F1: 0.0499 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:17<00:00, 8.45it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.13it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 07, Loss: 3.6980, Accuracy: 0.0259, F1: 0.0345 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:19<00:00, 8.39it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.37it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 08, Loss: 3.5636, Accuracy: 0.0347, F1: 0.0458 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:15<00:00, 8.49it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.43it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 09, Loss: 3.4992, Accuracy: 0.0225, F1: 0.0289 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:17<00:00, 8.45it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.36it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 10, Loss: 3.6468, Accuracy: 0.0338, F1: 0.0420 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:15<00:00, 8.50it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.25it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 11, Loss: 3.4158, Accuracy: 0.0428, F1: 0.0488 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:19<00:00, 8.41it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.38it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 12, Loss: 3.5234, Accuracy: 0.0438, F1: 0.0560 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:19<00:00, 8.39it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.81it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 13, Loss: 3.4410, Accuracy: 0.0471, F1: 0.0596 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:22<00:00, 8.33it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.12it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 14, Loss: 3.3213, Accuracy: 0.0478, F1: 0.0569 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:20<00:00, 8.37it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.68it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 15, Loss: 3.3446, Accuracy: 0.0487, F1: 0.0619 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:15<00:00, 8.48it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.85it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 16, Loss: 3.4295, Accuracy: 0.0461, F1: 0.0562 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:18<00:00, 8.42it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.28it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 17, Loss: 3.2419, Accuracy: 0.0523, F1: 0.0603 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:20<00:00, 8.37it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.65it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 18, Loss: 3.2632, Accuracy: 0.0540, F1: 0.0641 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:22<00:00, 8.33it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.11it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 19, Loss: 3.3026, Accuracy: 0.0603, F1: 0.0707 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:21<00:00, 8.36it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.19it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 20, Loss: 3.2678, Accuracy: 0.0477, F1: 0.0597 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:20<00:00, 8.39it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.20it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 21, Loss: 3.1666, Accuracy: 0.0586, F1: 0.0711 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:21<00:00, 8.35it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:15<00:00, 11.82it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 22, Loss: 3.1788, Accuracy: 0.0558, F1: 0.0669 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:20<00:00, 8.38it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.45it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 23, Loss: 3.1845, Accuracy: 0.0515, F1: 0.0625 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:17<00:00, 8.44it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.37it/s]\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Saving model: gin_128bz_256emb_0.005lr_25it.net\n", | |
"\n", | |
"It: 24, Loss: 3.1288, Accuracy: 0.0703, F1: 0.0822 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"Training Batch: 100%|██████████| 3188/3188 [06:16<00:00, 8.46it/s]\n", | |
"Eval Batch: 100%|██████████| 179/179 [00:14<00:00, 12.37it/s]" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"It: 25, Loss: 3.0778, Accuracy: 0.0602, F1: 0.0735 \n", | |
"------------------------------------------------------------\n" | |
] | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stderr", | |
"text": [ | |
"\n" | |
] | |
} | |
], | |
"source": [ | |
"# store metrics for each iteration\n", | |
"history = { 'train_loss': [], 'val_metrics': [], 'val_acc': [], 'val_f1': [] }\n", | |
"\n", | |
"for it in range(cfg.max_iter):\n", | |
" # place model in training mode\n", | |
" model.train()\n", | |
" train_loss = .0\n", | |
" # iterate through batches defined in the dataloader\n", | |
" for batch_id, batch in enumerate(tqdm(train_loader, desc = 'Training Batch')):\n", | |
" optimizer.zero_grad()\n", | |
" batch = batch.to(cfg.device)\n", | |
" pred = model(batch)\n", | |
" curr_loss = .0\n", | |
" for i in range(len(pred)):\n", | |
" curr_loss += ceLoss(pred[i].to(torch.float32), batch.y_tensor[:,i])\n", | |
" curr_loss = curr_loss / len(pred)\n", | |
" \n", | |
" curr_loss.backward()\n", | |
" optimizer.step()\n", | |
" train_loss += curr_loss.item()\n", | |
" avg_loss = train_loss / (batch_id + 1)\n", | |
" history['train_loss'].append(avg_loss)\n", | |
"\n", | |
" # place model in evaluation mode (ignore gradient tracking)\n", | |
" model.eval()\n", | |
" \n", | |
" # evaluate model based on validation and test sets\n", | |
" val_metrics = model_evaluator.eval()[0] # dataset.eval_metric\n", | |
" history['val_metrics'].append(val_metrics)\n", | |
" history['val_acc'].append(val_metrics['acc'])\n", | |
" history['val_f1'].append(val_metrics['F1'])\n", | |
"\n", | |
" # determine best model based on validation accuracy\n", | |
" best_it = np.argmax(np.array(history['val_f1']))\n", | |
" if best_it == it:\n", | |
" model.save()\n", | |
"\n", | |
" print('')\n", | |
" print(f'It: {(it+1):02d}, '\n", | |
" f'Loss: {avg_loss:.4f}, '\n", | |
" f'Accuracy: {val_metrics[\"acc\"]:.4f}, '\n", | |
" f'F1: {val_metrics[\"F1\"]:.4f} ')\n", | |
" print('-'*60)\n", | |
"\n", | |
" with open(cfg.history_write_path, 'w') as f:\n", | |
" json.dump(history, f)" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "SAnBazBFWx0K" | |
}, | |
"source": [ | |
"# 5. Results\n", | |
"\n", | |
"Plot the training loss with validation/test accuracy to assess model performance and fine-tune hyperparameters." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 24, | |
"metadata": { | |
"id": "WMEhrYG5Xs8C", | |
"colab": { | |
"base_uri": "https://localhost:8080/", | |
"height": 580 | |
}, | |
"outputId": "4ad6ac17-87d6-40db-9280-4ac5d74119ed" | |
}, | |
"outputs": [ | |
{ | |
"output_type": "display_data", | |
"data": { | |
"text/plain": [ | |
"<Figure size 432x288 with 1 Axes>" | |
], | |
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXgAAAEICAYAAABVv+9nAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAAsTAAALEwEAmpwYAAApAUlEQVR4nO3deXwddb3/8dcnyclykjRJ0zR03ymFttQSSmkFiiCi4EWvVEHgFhRFREBU4P744ZV7ryheF7zCBcQfqCyXxYsWvSAKWGyhLE3L2o0upG26Zmv2Pd/fHzNJT7OnTXI4c97Px+M8MufMnJnvZE7eZ+Y7M5+Ycw4REQmehGg3QEREhoYCXkQkoBTwIiIBpYAXEQkoBbyISEAp4EVEAkoBHyPM7M9mtmywp40WMzvNzDZHuQ23mNn/i2YbosHMLjezl6Pdjs7M7CUzuzLa7QgSBfwQMrOaiEebmdVHPL9kIPNyzn3SOffbwZ52IMxsoZk9b2blZlZiZr8zszER428zs+ZO6z21hzaucs7NjHhvkZmdPdhtjpj/EjMr7tSGHzjnFCgxxsxGm9ljZrbHzCrN7BUzOyXa7fowUsAPIedcRvsD2Al8OuK1R9unM7Ok6LVyQHKA+4HJwCSgGvh1p2meiFxv59z2oW6UefRZjh8ZwBrgJGAk8FvgGTPLiGqrPoT0RxEF7XuTZnazme0Dfm1mOWb2v/6ecYU/PD7iPR2Hr+2H2Gb2E3/aD8zsk0c47RQzW2lm1Wb2gpn9l5k90l27nXN/ds79zjlX5ZyrA+4GFh/N78AffhiYCPzJ3+u/yX99oZmtNrODZva2mS3ptI63m9krQB0w1cyuMLON/rpsN7Or/GnTgT8DYyOOLMb6RxyPRMzzH8xsvb+8l8xsVsS4IjP7jpm94+81PmFmqf64Uf72Ougf3azq6QvHzBaZ2Rp/HmvMbFGndfp3f4+02sz+amajevkdfsXMtvrL/KOZjY0Yd46ZbfaXc4+Z/d0O7/4wM7vbH7/JzM6KGNHvz4Q//Ugz+7V5e9QVZra8n238uL/sSjO7G7BO8/2Svz0rzOwvZjYJwDm33Tn3M+fcXudcq3PufiAZmIkczjmnxzA8gCLgbH94CdAC/AhIAdKAXOBzQBjIBH4HLI94/0vAlf7w5UAz8BUgEbga2APYEUz7KvATvD+QjwJVwCP9XKdvAq9FPL8NqATKgfXA1b28dwlQ3N3vx38+DigDPoW3I/Jx/3lexDruBE4AkoAQcB4wDS8ozsAL/vndLS+ivY/4w8cCtf5yQsBNwFYgOaJ9bwBj8fYaNwJf88f9ELjPf18IOK3999tpeSOBCuAyv80X+89zI9Zpm9+WNP/5HT38/j4GlALz8T5DdwEr/XGj/O34j/5yrvc/A5GfiRbgBr+9X/C328gj+UwAzwBP4B3hhYAz+tnGauBC/z03+G1qb+MF/u9/lr8OtwKre1j+PKAByIr23/mH7RH1BsTLg64B3wSk9jL9PKAi4vlLnf5At0aMCwMOOGYg0+LtNbcA4Yjxj/T2xxwx3Vy8ID8t4rXj8QIwEVgE7AUu7uH9S+g94G8GHu70nr8AyyLW8d/6aONy4Prulue/dhuHAv67wJMR4xKA3cCSiPZdGjH+P4D7/OF/A54GpvfRnsuANzq99ipwecQ63Rox7uvAcz3M6wHgPyKeZ+CF+GTgn4BXI8YZsKvTZ6LjS95/7Q2/fQP6TABjgDYg5wja+FqnNhZHtPHPwJc7bY86YFKnZYwA3gX+z2D+vQbloS6a6ClxzjW0PzGzsJn90sx2mFkVsBLINrPEHt6/r33Aed0l4P0BDWTasUB5xGvgBUGvzGw63h/g9c65VRHz3uCc2+O8w+bVwH/i7aEdiUnAUr/b46CZHcTbmxwTMc1hbTWzT5rZa353wEG8vf8euzg6GQvsiFiXNn/+4yKm2RcxXMeh3/eP8fY2/+p3Df1zf5bh29HPZfTV3hq8I5xx/rhdEeMcXnhG2u2/HtmOsfTxmTCz+yK6uW4BJvjTVwxCGyO35yTgPyO2fTnel0DH78rM0oA/4X1R/LCb5cc9BXz0dC7j+W28PsRTnHMjgNP9142hsxcYaWbhiNcm9PYGvx/0BeDfnXMP9zF/R//b3/n3sQtvDz474pHunLuju/eYWQrwFF7XQr5zLht4NmL5fZVN3YMXKu3zM7zfxe4+G+5ctXPu2865qcA/AN+K7NPuaRm+if1ZRj/am47Xzbcbb7tGnr+xyOe+cf7rke3YQx+fCefc19yhE+g/wNtOI80s+wjaOCFinHH4Z28XcFWn7Z/m7zi0b+/leF9cV3WzbEEB/2GSCdQDB81sJPC9oV6gc24HUAjcZmbJZnYq8OmepjezccDfgLudc/d1M/4C804Wm5ktAK7D67roj/1A5CWVjwCfNrNPmFmimaWad2K2c1C1S8br5y0BWsw7kXxOp/nnmllWD+9/EjjPzM4ysxDeF24jsLqvhpvZ+WY23Q+pSqAVr9uis2eBY83si2aWZGZfwOvW+t++ltGNx4ArzGyeH3Y/AF53zhXh9YnPMbPPmHeF1jV4XXKRRgPXmVnIzJbi9XU/O9DPhHNuL97R3D3+tg+ZWfvOSV9tPMHM/tFv43Wd2ngf8H/M7AQAM8vy24m/ff4H7+9lmX+0Jd1QwH94/BzvxFop8Brw3DAt9xLgVLxD5+/jnSxr7GHaK/FC+LaIw/SaiPEX4XVVVAMPAT9y/b8e/4fArf4h+Xecc7vwTrTdghfau4Ab6eEz65yrxguJJ/FOXH4R+GPE+E14gbPdX8bYTu/fDFyKdyKwFC/UPu2ca+pH22fgHdXU4PWp3+OcW9FNG8uA8/G+PMrwTuSe75wr7ccyOs/rBbzzBk/h7Q1Pw/v9489vKd55gjK8L5FCDt+ur/vtLgVuBy702wcD+0yA13ffDGwCDuCdfO9vG+/wlzMDeCVi/f6AdxHC436X5XtA+9Vfi/B+j+fg7RC1fxZP6+v3Fm/ar6QQAcDMngA2OeeG/AhChod5l2wWA5d098XTj/frMxGjtAcf58zsZDObZmYJZnYu3l7z8ig3S46S37WV7XeN3IJ3LuK1fr5Xn4mAiJU7KGXoHAP8Hu/kVzHetetvRrdJMghOBf4b79zEBuAzzrn6fr5Xn4mAUBeNiEhAqYtGRCSgotZFM2rUKDd58uRoLV5EJCatXbu21DmX159poxbwkydPprCwMFqLFxGJSWbW+W7oHqmLRkQkoBTwIiIBpYAXEQkoXQcvMgDNzc0UFxfT0NDQ98QiRyE1NZXx48cTCoWOeB4KeJEBKC4uJjMzk8mTJ3N4MUaRweOco6ysjOLiYqZMmXLE81EXjcgANDQ0kJubq3CXIWVm5ObmHvWRogJeZIAU7jIcBuNzFnMBv3lfNT/5y2YqavtTxVVEJH7FXMB/UFrL3Su2sqeyv3WTRIKjrKyMefPmMW/ePI455hjGjRvX8bypqfednsLCQq677ro+l7Fo0aJBaetLL73E+eefPyjz6s2bb77Jl7/8ZQB+85vf8
I1vfGPIl9kfRUVFzJ49u9dpSkpKOPfcc4esDTF3kjU77J1RrqxrjnJLRIZfbm4ub731FgC33XYbGRkZfOc73+kY39LSQlJS93/WBQUFFBQU9LmM1av7/CdWHyo/+MEPuPXWW6PdjCOSl5fHmDFjeOWVV1i8ePGgzz/m9uDbA75CAS8CwOWXX87XvvY1TjnlFG666SbeeOMNTj31VD7ykY+waNEiNm/eDBy+R33bbbfxpS99iSVLljB16lR+8YtfdMwvIyOjY/olS5Zw4YUXctxxx3HJJZfQXn322Wef5bjjjuOkk07iuuuuG9Ce+mOPPcacOXOYPXs2N998MwCtra1cfvnlzJ49mzlz5nDnnXcC8Itf/ILjjz+euXPnctFFF3WZV3V1Ne+88w4nnnhil3FFRUV87GMfY+7cuZx11lns3LkTgG3btrFw4ULmzJnDrbfe2rG+kWpraznvvPM48cQTmT17Nk888QQAa9asYdGiRZx44oksWLCA6upqioqKOO2005g/fz7z58/v9guytbWVG2+8kZNPPpm5c+fyy1/+smPcZz7zGR599NF+//4GIub24HPCyQAcrFcfvETXv/5pPRv2VA3qPI8fO4LvffqEAb+vuLiY1atXk5iYSFVVFatWrSIpKYkXXniBW265haeeeqrLezZt2sSKFSuorq5m5syZXH311V2uuX7zzTdZv349Y8eOZfHixbzyyisUFBRw1VVXsXLlSqZMmcLFF1/c73bu2bOHm2++mbVr15KTk8M555zD8uXLmTBhArt37+a9994D4ODBgwDccccdfPDBB6SkpHS8FqmwsLDHbpBrr72WZcuWsWzZMh588EGuu+46li9fzvXXX8/111/PxRdfzH33dfnXwgA899xzjB07lmeeeQaAyspKmpqa+MIXvsATTzzBySefTFVVFWlpaYwePZrnn3+e1NRUtmzZwsUXX9ylztYDDzxAVlYWa9asobGxkcWLF3POOecwZcoUCgoKhuwIJOb24LPSvA/gQe3Bi3RYunQpiYmJgBdGS5cuZfbs2dxwww2sX7++2/ecd955pKSkMGrUKEaPHs3+/fu7TLNgwQLGjx9PQkIC8+bNo6ioiE2bNjF16tSO67MHEvBr1qxhyZIl5OXlkZSUxCWXXMLKlSuZOnUq27dv59prr+W5555jxIgRAMydO5dLLrmERx55pNuup71795KX131hxVdffZUvfvGLAFx22WW8/PLLHa8vXboUoGN8Z3PmzOH555/n5ptvZtWqVWRlZbF582bGjBnDySefDMCIESNISkqiubmZr3zlK8yZM4elS5eyYcOGLvP761//ykMPPcS8efM45ZRTKCsrY8uWLQCMHj2aPXv29Pt3OBAxtwefGkokNZRAZb0CXqLrSPa0h0p6enrH8He/+13OPPNM/vCHP1BUVMSSJUu6fU9KSkrHcGJiIi0tLUc0zWDIycnh7bff5i9/+Qv33XcfTz75JA8++CDPPPMMK1eu5E9/+hO3334777777mFBn5aWNiR3FR977LGsW7eOZ599lltvvZWzzjqLz372s91Oe+edd5Kfn8/bb79NW1sbqampXaZxznHXXXfxiU98osu4hoYG0tLSBn0dIAb34AGy05J1maRIDyorKxk3bhzgXVUy2GbOnMn27dspKioC6Oif7o8FCxbw97//ndLSUlpbW3nsscc444wzKC0tpa2tjc997nN8//vfZ926dbS1tbFr1y7OPPNMfvSjH1FZWUlNTc1h85s1axZbt27tdlmLFi3i8ccfB+DRRx/ltNNOA2DhwoUdXVbt4zvbs2cP4XCYSy+9lBtvvJF169Yxc+ZM9u7dy5o1awCv/7+lpYXKykrGjBlDQkICDz/8MK2trV3m94lPfIJ7772X5mZvx/T999+ntra2Y7ivq22OVMztwYN3ovWg9uBFunXTTTexbNkyvv/973PeeecN+vzT0tK45557OPfcc0lPT+/osujOiy++yPjx4zue/+53v+OOO+7gzDPPxDnHeeedxwUXXMDbb7/NFVdcQVtbGwA//OEPaW1t5dJLL6WyshLnHNdddx3Z2dmHzf+4446jsrKS6upqMjMzDxt31113ccUVV/DjH/+YvLw8fv3rXwPw85//nEsvvZTbb7+dc889l6ysrC7tfvfdd7nxxhtJSEggFApx7733kpyczBNPPMG1115LfX09aWlpvPDCC3z961/nc5/7HA899FDH76SzK6+8kqKiIubPn49zjry8PJYvXw7AihUrhmQ7QRT/J2tBQYE70n/4cdH9r9LWBk9+7dRBbpVI7zZu3MisWbOi3Yyoq6mpISMjA+cc11xzDTNmzOCGG26ISlvuvPNOMjMzufLKK/s1fV1dHWlpaZgZjz/+OI899hhPP/30ELeyZ6effjpPP/00OTk5XcZ193kzs7XOub6vdyWGu2h0FY1I9PzqV79i3rx5nHDCCVRWVnLVVVdFrS1XX331YecK+rJ27VrmzZvH3Llzueeee/jpT386hK3rXUlJCd/61re6DffBEJN78P/81Du8uOkAa/7v2YPcKpHeaQ9ehlN87sGHk6msayZaX04S3/S5k+EwGJ+zGA34EE2tbdQ3dz1bLTKUUlNTKSsrU8jLkGqvB9/dJZcDEZtX0aQdKlcQTo7JVZAYNX78eIqLiykpKYl2UyTg2v+j09GIyXTMbi9XUNfEuOyhuUFApDuhUOio/sOOyHCK2S4aUEVJEZHexHTA62YnEZGexWbAp3ldNBV1uhZeRKQnsRnwYVWUFBHpS0wGvCpKioj0rc+AN7NUM3vDzN42s/Vm9q/dTJNiZk+Y2VYze93MJg9JayNkpyVzUF00IiI96s8efCPwMefcicA84FwzW9hpmi8DFc656cCdwI8GtZXdyA6H9G/7RER60WfAO097EeaQ/+h8G98FwG/94f8BzjIzG7RWdiM7HNJlkiIivehXH7yZJZrZW8AB4Hnn3OudJhkH7AJwzrUAlUDuILazC1WUFBHpXb8C3jnX6pybB4wHFpjZEf37ETP7qpkVmlnh0d7qrS4aEZHeDegqGufcQWAFcG6nUbuBCQBmlgRkAWXdvP9+51yBc66gp3+U21+qKCki0rv+XEWTZ2bZ/nAa8HFgU6fJ/ggs84cvBP7mhjh5VVFSRKR3/Sk2Ngb4rZkl4n0hPOmc+18z+zeg0Dn3R+AB4GEz2wqUAxcNWYt97RUlD6qipIhIt/pMRufcO8BHunn9XyKGG4Clg9u03rXfzVpR18RYVZQUEekiJu9khUMlg3WppIhI92I44FVRUkSkN7Eb8Gnt//RDAS8i0p3YDfiIPngREekqZgNeFSVFRHoXswEPqigpItKb2A54lSsQEelRzAe8LpMUEelebAe8KkqKiPQotgM+HNJlkiIiPYjpgM/yA14VJUVEuorpgM8JJ6uipIhID2I64CMrSoqIyOFiO+B1N6uISI9iPOBVUVJEpCcxHvCqKCki0pPYDnhVlBQR6VFsB7z64EVEehTTAa+KkiIiPYvpgAdVlBQR6UnsB7zKFYiIdCvm
Az4rTQEvItKdmA/4nLAqSoqIdCfmA15dNCIi3Yv5gFdFSRGR7sV8wKuipIhI92I+4FVRUkSke7Ef8GEFvIhId2I+4LM66tHoShoRkUgxH/A56aooKSLSnZgPeFWUFBHpXuwHfEdNeHXRiIhEivmAb68oqT14EZHDxXzAgypKioh0JxgBr3IFIiJdBCLgVVFSRKSrQAS8KkqKiHQViIBXF42ISFd9BryZTTCzFWa2wczWm9n13UyzxMwqzewt//EvQ9Pc7mWFQxysV0VJEZFISf2YpgX4tnNunZllAmvN7Hnn3IZO061yzp0/+E3sW3ZaMk0tXkXJcHJ/VklEJPj63IN3zu11zq3zh6uBjcC4oW7YQOSo4JiISBcD6oM3s8nAR4DXuxl9qpm9bWZ/NrMTenj/V82s0MwKS0pKBt7aHqiipIhIV/0OeDPLAJ4Cvumcq+o0eh0wyTl3InAXsLy7eTjn7nfOFTjnCvLy8o6wyV11VJTUlTQiIh36FfBmFsIL90edc7/vPN45V+Wcq/GHnwVCZjZqUFvai46KktqDFxHp0J+raAx4ANjonPtZD9Mc40+HmS3w51s2mA3tjSpKioh01Z9LThYDlwHvmtlb/mu3ABMBnHP3ARcCV5tZC1APXOSG8ZpFVZQUEemqz4B3zr0MWB/T3A3cPViNGqjUUCIpSaooKSISKRB3soJfrkAVJUVEOgQm4FWuQETkcIEJ+Ky0kP4vq4hIhMAEvLcHry4aEZF2gQl4rw9ee/AiIu0CE/CqKCkicrjABHx7RcmG5rZoN0VE5EMhMAHfXlGyQv3wIiJAgAJeFSVFRA4XmIBXRUkRkcMFJuC1By8icrjABHxOWBUlRUQiBSbgVVFSRORwgQn49oqSldqDFxEBAhTw4HXT6DJJERFPoAJeFSVFRA4JVMCroqSIyCGBCnhVlBQROSRQAa+KkiIihwQq4FVRUkTkkEAFvCpKiogcEqyAV0VJEZEOgQr4HNWjERHpEKiAV0VJEZFDAhXw7V00KlcgIhKwgG+vKFmhgBcRCVbAq6KkiMghgQp4VZQUETkkUAEP3l68LpMUEQlgwKtcgYiIJ3ABr4qSIiKewAV8djikPngREYIY8Gn6r04iIhDEgE9XRUkREQhiwKuipIgIEMSA181OIiJAAAO+vaJkRa1OtIpIfOsz4M1sgpmtMLMNZrbezK7vZhozs1+Y2VYze8fM5g9Nc/umipIiIp6kfkzTAnzbObfOzDKBtWb2vHNuQ8Q0nwRm+I9TgHv9n8NOFSVFRDx97sE75/Y659b5w9XARmBcp8kuAB5ynteAbDMbM+it7YdD/9VJAS8i8W1AffBmNhn4CPB6p1HjgF0Rz4vp+iUwLNpLBquLRkTiXb8D3swygKeAbzrnqo5kYWb2VTMrNLPCkpKSI5lFn1RRUkTE06+AN7MQXrg/6pz7fTeT7AYmRDwf7792GOfc/c65AudcQV5e3pG0t1+ywyEVHBORuNefq2gMeADY6Jz7WQ+T/RH4J/9qmoVApXNu7yC2c0BywipXICLSn6toFgOXAe+a2Vv+a7cAEwGcc/cBzwKfArYCdcAVg97SAVBFSRGRfgS8c+5lwPqYxgHXDFajjlZ2OERRaV20myEiElWBu5MVvHo0uopGROJdMAM+PURFnSpKikh8C2bAq6KkiEhAA14VJUVEAhrwaaooKSISzIBXuQIRkaAGvCpKiogEOuB1s5OIxLNABnx7RUmVKxCReBbIgFdFSRGRgAY8qKKkiEhwA17lCkQkzgU34MMh/ds+EYlrgQ549cGLSDwLbsCri0ZE4lxwAz6sipIiEt8CHPCqKCki8S3AAa+KkiIS34Ib8H5FSV0LLyLxKrgBr3IFIhLnAhzwqigpIvEt8AGvipIiEq+CG/Bp6qIRkfgW2IBPS1ZFSRGJb4ENeFBFSRGJb8EOeJUrEJE4FuyAV0VJEYljgQ949cGLSLwKdsCri0ZE4liwA14nWUUkjgU84JNpbGmjvqk12k0RERl2AQ94VZQUkfgV7IBXRUkRiWOBDvgsfw9e5QpEJB4FOuBz/JLBulRSROJRoANeFSVFJJ4FO+D9ipLqgxeReBTogG+vKHlQffAiEof6DHgze9DMDpjZez2MX2JmlWb2lv/4l8Fv5pHTzU4iEq+S+jHNb4C7gYd6mWaVc+78QWnRIFO5AhGJV33uwTvnVgLlw9CWIZGlPXgRiVOD1Qd/qpm9bWZ/NrMTeprIzL5qZoVmVlhSUjJIi+5djgJeROLUYAT8OmCSc+5E4C5geU8TOufud84VOOcK8vLyBmHRfVMXjYjEq6MOeOdclXOuxh9+FgiZ2aijbtkg0UlWEYlXRx3wZnaMmZk/vMCfZ9nRznewZIVDqigpInGpz6tozOwxYAkwysyKge8BIQDn3H3AhcDVZtYC1AMXOefckLV4gNrLFRysbyItOS3KrRERGT59Brxz7uI+xt+Ndxnlh1JkRckxWQp4EYkfgb6TFQ5VlFQ/vIjEm8AHfEcXjcoViEicCXzAq6KkiMSr4Ae8KkqKSJwKfMCnhhJITkrQzU4iEncCH/Bm5pUrqNUevIjEl/5Uk4x5H8ZyBW1tjt0H63EOJuaGo90cEQmguAj4aFaUbGtzFFfUs+VANVsO1PD+/mq2Hqhh64Ea6vy7a6fmpXP2rHzOnpXP/InZJCUG/sBKRIZBXAR8TjjEqi2lXPPoOqblpTNtdAbT8jKYmpdOOHlwfgUNza3srWxgy34vyLceqGHLAS/MG5rbOqbLH5HCsfmZfOHkCcwYnUlTSysvbjrAr1/5gPtXbicnHOLMmaM5+/h8TpsxiszU0KC0T0TiT1wE/GULJ9Pa5li/p5I/v7eXtohCCuOy05ial860vAw/+NOZnpdBXmYKANWNLRyoauRAdQMl1Y0dwwc6DVc3tBy2zLFZqUzPz+SSU3I5Nj+D6aMzmT46g6y0roF9+eIpVDc0s/L9Ul7cuJ+/bT7A79/cTSjRWDg1l48fn89Zs/IZl607cUWk/yxaZWMKCgpcYWHhsC+3saWVHWV1bPP3sreV1LCtpJZtJYe6TAAyUpJoaWs7bO+7XUpSAqNHpDA6M5XRmSneY0Qq+SNSvS+I0RlHtefd0trGup0HeWHjfl7YuJ/tJbUAzBozgrNnjWb66AxSkhJJS04kNSmBtORE0kKJpHY8EkgLJaqrRySAzGytc66gX9PGW8D3xDnHvqoGL/QP1PBBaS2hxE5BPiKFvMxURqQm4RfQHBbbS2p4ceMBnt+4n8Ki8sOOQHoTSjRSkxLJSE1i/qQcTp8xio/OyNORgEgMU8AHWFVDM2U1TdQ3tVLf3Epjs/ezvrmV+qZWGlraaPDHNfivl9c28eq2Mg5UNwLeSd3Tpo/itBl5LJyWS0bK8PbU1Ta2sGpLCW/uOsjZs/I5efLIYV2+SCxTwEsXzjm2HKhh5fslvLy1lNe2l9HQ3EZSgjF/Yg4fnTGK02aMYu74bBITBv/oZM/Bel7cuJ8XNh7g1W1lNLUe6vpaMGUk3zhzOqfNGDWsR0YisUgBL31qbGl
l7Y4KVm0p5eUtpby3pxLnYERqEounj2LBlJFMzk1nwsgw43PSSA0lDmj+zjne213F8xv38+LG/azfUwXA5NwwZ8/yThrPHjeC3xUWc//K7eyramDu+CyuOXM6H5+VT8IQfMmIBIECXgasvLaJV7aWsmpLCau2lLK3sqFjnBkcMyKVCSPDTIx4TBgZZlJumNz0ZMyMhuZWVm8r5YWNB3hx4372VzWSYHDSpBzO8q/zn5aX3mUvvbGlld+v2829L21jZ3kdx+Zn8PUl0zl/7pijOlG8v6qBdTsqSEgwZuZnMmFkeEiOTkSGkwJejopzjpKaRnaV17GzvI6dZfXsLK/reL6vquGw6cPJiYzPSWNXeT31za2kJydy+rF5nDUrnzNn5pGbkdKv5ba0tvHMu3v5rxVbeX9/DZNyw3ztjGn84/xxpCT1fgTR0trGpn3VrNtZwdodFRQWVbD7YP1h06SGEjg2P5Nj8zOZmZ/JzGO8x+jMFHUNScxQwMuQamhupbiiPfzr2Flez66KOo4ZkcrZx+ezcOrIPgO5N21tjuc37ue/VmzlneJKjhmRyldOn8rFCyZ03JhWWd/Mm36Yr91RwVu7DnZc5po/IoWCSSOZPymH+ROzMTPe31fN5v3VbPZ/lvgnnMErKR0Z+scdk8nscVkD7pYSGQ4KeAkE5xwvby3l7r9t5fUPyhmZnswZx+axfk8l7++vASAxwZg1JpOTJuYwf1IOBZNHMjYrtc898vLaJjbvq+b9/dVs8n++v6+a6kbvhrXkpAQKJuWwePooTp2Wy9xxWcN6X0Fbm2NHeR0b91Z1PLaX1nLKlFwuXTiRE8ZmDVtb5MNFAS+BU1hUzt0rtvJucSVzxmdx0sQcTpqcw4njs0kfpMs8nXPsqWxgw54qXttexitbS9m0rxrwbnw7ZcpIFk0fxaJpuczMzxy0E8E1jS1s3lfFhr3VHWG+eV91xxFJYoIxdVQ643PSeNW/+mn+xGwuXTiJT80ZoyONOKOAFxkkZTWNvLq9jNXbyli9tZSisjoActOTWTgtl8XTvMCflBvGzGhsaaW2sZXaxhZqGluobWyh2v/pvdbaMbyzvI4Ne6vY4c8TvKuYZo0ZwawxIzje/zkjP6MjxCvrmnlqXTGPvLaD7aW15IRDfL5gApecMklVSeOEAl5kiOw+WM/qraW8uq2MV7aVsr/K68vPSEmisaWV5tb+/T0lJyUwNiu1I8xnjRnB8WNH9Kt7CbyjjdXbynjktR38dcN+2pzj9Bl5XLZwEmceN3pIrxZqbPFuniutbqK0tpHymibSkhPJCSczMj2ZnPQQOeFkQiqVMSQU8CLDwDnH9tJaVm8rY+v+atKSk8hMTSI9OZH0lCQyUpJI9x/ecGLHa4MZfvsqG3h8zU4ee2Mn+6saGZedxhdPmcjnCyZ0FM3rTmubo7aphbrGVmqb2o8yWqmsb6K0ponSmkbK/J/twyU1XQvr9SQzNYmR6clkh5MZGQ6Rk57MyHAyOenJjM5MYd6EbKblZeiehwFSwIvEoebWNl7cuJ+HX9vBK1vLOqqRAtQ1eV1D7T9rm1q6LaTXWXY4RG56MqMyUhiVmcIofzg3I4VRGcnkZqQwMj2ZhuZWKuqaqKhtpryuiYraJsprm6ioO/SzoraZ8tom6psPFfXLSgtx0qQcTpqUQ8GkHE6ckK1zCn0YSMDHRblgkXgQSkzg3NljOHf2GLaV1PDoaztZva2U1FAi6SmJ5KaHSU9JIuwfYaQne0cV4U4/R6SGyMtMISecTHLS4HezeJfZ1nv3LBRVULijnL9tOuCvg3HC2CwKJuVQMDmHkyaN7PUoRHqnPXgRibry2ibW7aigcEcFa3eU83ZxJU0t3hHGpNwwJ03KYeLIME0tbd6jta1juLG1jcbm9tdaDxvf0uZo7fRoc67L6+2vJZoxJjuVCTlh7zEyzS/X4Q3nZUT/pjjtwYtITBmZnszZx+dz9vH5gHci973dVazdUU5hUQV/31xCWW0TiQlGSlICyUkJJCf6P/3hlFAiKYkJhJOTyPZfS0o0EhP8h0UM9/BaS5tjz8F6dpXX8bfNBw67IQ68u6HH54SZkOMFf/uXwNjsNMZlpzHSL9vxYaGAF5EPnZSkxI6++a+e7p3QbnMMey2h+ibvru1dFXXsKveCv324cEdFlxPOaaFExmanMi4nzLjsNMbnpHnPs8OMy0kjPzNlWG+YU8CLyIeemZEYhR3jtOREZuRnMiM/s9vxlXXN7KqoY/fBenZX1LP7YD17Dno/1++upKy26bDpExOMY0akcvmiyXzl9KlD3n4FvIjIEcoKh8gKZzF7XPelI+qbWr3wbw9+/0tg9IjhOXGsgBcRGSJpyYlMH53B9NEZUVm+bjUTEQkoBbyISEAp4EVEAkoBLyISUAp4EZGAUsCLiASUAl5EJKAU8CIiARW1apJmVgLsOMK3jwJKB7E5sSae1z+e1x3ie/217p5Jzrm8/rwpagF/NMyssL/lMoMontc/ntcd4nv9te4DX3d10YiIBJQCXkQkoGI14O+PdgOiLJ7XP57XHeJ7/bXuAxSTffAiItK3WN2DFxGRPijgRUQCKuYC3szONbPNZrbVzP452u0ZTmZWZGbvmtlbZlYY7fYMNTN70MwOmNl7Ea+NNLPnzWyL/zMnmm0cKj2s+21mttvf/m+Z2aei2cahYmYTzGyFmW0ws/Vmdr3/erxs+57Wf8DbP6b64M0sEXgf+DhQDKwBLnbObYhqw4aJmRUBBc65uLjZw8xOB2qAh5xzs/3X/gMod87d4X/B5zjnbo5mO4dCD+t+G1DjnPtJNNs21MxsDDDGObfOzDKBtcBngMuJj23f0/p/ngFu/1jbg18AbHXObXfONQGPAxdEuU0yRJxzK4HyTi9fAPzWH/4t3gc/cHpY97jgnNvrnFvnD1cDG4FxxM+272n9ByzWAn4csCvieTFHuOIxygF/NbO1ZvbVaDcmSvKdc3v94X1AfjQbEwXfMLN3/C6cQHZRRDKzycBHgNeJw23faf1hgNs/1gI+3n3UOTcf+CRwjX8YH7ec178YO32MR+9eYBowD9gL/DSqrRliZpYBPAV80zlXFTkuHrZ9N+s/4O0fawG/G5gQ8Xy8/1pccM7t9n8eAP6A12UVb/b7fZTtfZUHotyeYeOc2++ca3XOtQG/IsDb38xCeOH2qHPu9/7LcbPtu1v/I9n+sRbwa4AZZjbFzJKBi4A/RrlNw8LM0v0TLphZOnAO8F7v7wqkPwLL/OFlwNNRbMuwag8332cJ6PY3MwMeADY6534WMSoutn1P638k2z+mrqIB8C8N+jmQCDzonLs9ui0aHmY2FW+vHSAJ+O+gr7uZPQYswSuVuh/4HrAceBKYiFdu+vPOucCdjOxh3ZfgHZ47oAi4KqJPOjDM7KPAKuBdoM1/+Ra8fuh42PY9rf/FDHD7x1zAi4hI/8RaF42IiPSTAl5EJKAU8CIiAaWAFxEJKAW8iEhAKeBFRAJKAS8iElD/H7T78lPYCQ1qAAAAAElFTkSuQmCC\n" | |
}, | |
"metadata": { | |
"needs_background": "light" | |
} | |
}, | |
{ | |
"output_type": "display_data", | |
"data": { | |
"text/plain": [ | |
"<Figure size 432x288 with 1 Axes>" | |
], | |
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAX4AAAEICAYAAABYoZ8gAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAAsTAAALEwEAmpwYAAA+ZklEQVR4nO3dd3yUVfb48c9JT0gIkEYnQAgBDKGDSkddFAUbCq6Ftaz6Xes23aauu+5vi7tucZur7ioWdFVcUIqFaqEEpCRAQoAASSAVUkmd+/tjJnEIk2SSTDLJzHm/Xnkx87Q5Tyaceebe+5wrxhiUUkp5Dx93B6CUUqpzaeJXSikvo4lfKaW8jCZ+pZTyMpr4lVLKy2jiV0opL6OJX7mUiGwSkbs76Ng/FpEXO+LYXUFH/u7aSkRiRcSIiJ+7Y1Guo4nfS4lIpoicE5Eyu5/n3R1XPRGZLSJZ9suMMb8yxnSpxKi+JiILROQzETkrIqdF5EURCXN3XOpCmvi92zXGmFC7nwfcHZDq1sKBXwL9gVHAAOB3bo1IOaSJX51HRAJtV2wX2S2Lsn07iBaR3iLygYjki8gZ2+OBTRzrKRF5ze75ec0GIvItETkoIqUiclRE7rUt7wGsBfrbfRvp7+B4C0Uk1RbvJhEZZbcuU0S+LyL7RKRYRN4SkaAm4vQRkZ+KyHERyRORV0UkvFHMd4jICREpEJGfNPP7C7ftn2873k9FxMe2zldEfm87xjERecBBM8pwEdkhIiUi8j8R6WN37NttxywUkZ/ZzvGyZmKZLiJf2H4/J0VkmZMxPmuL8SiwwMH5vSQip0QkW0R+KSK+AMaYN4wx64wxFcaYM8C/gEubik+5jyZ+dR5jTBXwHrDUbvFNwGZjTB7Wv5l/A0OAwcA5oK1NRHnA1UBP4FvAcyIywRhTDlwJ5Nh9G8mx31FE4oE3gUeAKGANsFpEAhrFPR8YCowFljURxzLbzxxgGBDq4JymAyOBecAT9h8yjfwF65XvMGAWcLvt3ADusZ3XOGACcK2D/W8H7gT6AbXAn23nOxr4G/BN27pwrFfUDonIEKwfnn/B+vsZB+xxMsargfHAJODGRof+jy2uONs2VwBNNb/NBFKbilG5kTFGf7zwB8gEyoCzdj/32NZdBhyx2/Zz4PYmjjMOOGP3fBNwt+3xU8BrdutiAQP4NXGs94GHbY9nA1mN1jccD/gZ8LbdOh8gG5htd3632q3/LfCPJl73U+D/7J6PBGoAP7uYB9qt3wEscXAcX6AaGG237F5gk+3xBuBeu3WX2f8+bL+7X9utH207ni/wBPCm3boQ27rLmjinHwEr2xjjfXbrrqiPEYgBqoBgu/VLgY0OXudy4AwQ7+6/df258Ed76r3btcaYTxws3wiEiMhUIBdrcl8JICIhwHNYr6R727YPExFfY0xda15cRK4EngTisSbuEGC/k7v3B47XPzHGWETkJOdfBZ+2e1xh26fFY9ke1ye6po4V6uA4kYC/g2PVx9QfOGm3zv6xo2XHbceLbLyvMaZCRArrn4tImd1+o4FBwBEXxGi/3RDbvqdEpH6ZT+PzEJFpwBvAjcaYdAcxKDfTph51AVsCfxvr1dxS4ANjTKlt9fewXhFPNcb0xPp1HkAuOBCUY03m9frWPxCRQOBd4FkgxhjTC2tzTf1xWiobm4M1EdUfT7Amu+wW9mvxWFibsGqxfui1RgHWbwqNj1Uf0ynAvj9kkINj2C8bbDteQeN9RSQYiKh/bs7vpD+BNRkPb2OMjWOodxLrFX+kMaaX7aenMWaMXVzjgVXAncaYTx28vuoCNPGrprwB3Iy1TfkNu+VhWNv1z9o6Hp9s5hh7gJkiMtjWWfoju3UBQCCQD9Tarv6vsFufC0TUd7I68DawQETmiYg/1g+kKuALJ8/P3pvAoyIyVERCgV8BbxljaltzELsPzGdEJMzWzv5doL5D+m3gYREZICK9gMccHOZWERlt+2b1NPCO7bjvANeIyCW2foyncPxhW+914DIRuUlE/EQkQkTGORnjQyIyUER6A4/bnd8p4CPg9yLS09YpPlxEZgGIdUDAOuBBY8zq1vzuVOfSxO/dVsv54/hX1q8wxmzHesXeH2snYb0/AsFYrxy3Yf2P7pAx5mPgLWAfsAv4wG5dKfAQ1kRzBrgF65Vi/fpDWBPyUduolPOaaYwxacCtWDsqC4BrsA5PrW7l7wDgZWA5sAU4BlQCD7bhONj2KweOAp9h/dB82bbuX1gT5z7gK6zfcGoB+yay5Vg7UE8DQVh/RxhjUm3HXoH1qrwMa+d4laMgbFf9V2H9QCzC+iGc5GSM64G9wG6sHf32bsf6oX0A6/v2DtbOZmyvFQW8ZPc3pZ27XZAYoxOxKOUOtm85/zDGDGlx4wv3DcXaIT/CGHPM1bEpz6ZX/Ep1EhEJFpGrbE0vA7A2k61saT+7/a8RkRCx3ufwLNaO8MyOiVZ5Mk38SnUeAX6OtYnkK+Ag1mGazlqEtSM6BxiBdUipfmVXraZNPUop5WX0il8ppbxMl7uBKzIy0sTGxro7DKWU6lZ27dpVYIyJcmbbLpf4Y2NjSU5OdncYSinVrYjI8Za3stKmHqWU8jKa+JVSysto4ldKKS/T5dr4HampqSErK4vKykp3h6K6kKCgIAYOHIi/v7+7Q1GqW+kWiT8rK4uwsDBiY2OxKwervJgxhsLCQrKyshg6dKi7w1GqW+kWTT2VlZVERERo0lcNRISIiAj9FqhUGziV+EVkvoikiUiGiDzuYH2gWOc0zRCR7SISa1v+TRHZY/djEZFxbQlUk75qTP8mlGqbFhO/bSLlv2KdK3Q0sNQ2/6e9u7BOvxeHdXam3wAYY143xowzxowDbgOOGWP2uC58pZTqerLOVLAu5ZS7w2iSM1f8U4AMY8xRW63zFViLRdlbBLxie/wOME8uvBxbatu325kzZw7r168/b9kf//hH7r///ib3mT17dsONaFdddRVnz569YJunnnqKZ599ttnXfv/99zlw4EDD8yeeeIJPPnE0W2LbPPLIIwwYMACLxeKyYyrl7f6+6Qj3vbabk0UV7g7FIWcS/wDOn1Mzi/PnNT1vG9usRcXYTQtnczPWiTUuICLfFpFkEUnOz893Ju5OtXTpUlasOP8za8WKFSxdutSp/desWUOvXr3a9NqNE//TTz/NZZdd1qZjNWaxWFi5ciWDBg1i8+bNLjmmI7W1rZrISqluLyWnBID/7WnLTKAdr1M6d22TdlcYY1IcrTfGvGCMmWSMmRQV5VSpiU5144038uGHH1JdbZ3cKTMzk5ycHGbMmMH999/PpEmTGDNmDE8+6XgWwtjYWAoKCgB45plniI+PZ/r06aSlpTVs869//YvJkyeTlJTEDTfcQEVFBV988QWrVq3iBz/4AePGjePIkSMsW7aMd955B4BPP/2U8ePHk5iYyJ133klVVVXD6z355JNMmDCBxMREDh065DCuTZs2MWbMGO6//37efPPrz+Tc3Fyuu+46kpKSSEpK4osvrLMZvvrqq4wdO
5akpCRuu+02gPPiAQgNDW049owZM1i4cCGjR1tbBq+99lomTpzImDFjeOGFFxr2WbduHRMmTCApKYl58+ZhsVgYMWIE9RcBFouFuLg4uuJFgVKN1dRZOHjKmvjf+yqbrlgB2ZnhnNmcP/nyQC6c0Lp+mywR8QPCgUK79Uto4mq/tX6+OpUDtk9TVxndvydPXjOmyfV9+vRhypQprF27lkWLFrFixQpuuukmRIRnnnmGPn36UFdXx7x589i3bx9jx451eJxdu3axYsUK9uzZQ21tLRMmTGDixIkAXH/99dxzzz0A/PSnP+Wll17iwQcfZOHChVx99dXceOON5x2rsrKSZcuW8emnnxIfH8/tt9/O3//+dx555BEAIiMj2b17N3/729949tlnefHFFy+I580332Tp0qUsWrSIH//4x9TU1ODv789DDz3ErFmzWLlyJXV1dZSVlZGamsovf/lLvvjiCyIjIykqKmrx97p7925SUlIahlu+/PLL9OnTh3PnzjF58mRuuOEGLBYL99xzD1u2bGHo0KEUFRXh4+PDrbfeyuuvv84jjzzCJ598QlJSEl3xokCpxjLyyqiutXDxsAi+PFrIvqxikgb1cndY53Hmin8nMMI2EXUA1iS+qtE2q4A7bI9vBDbUTxAhIj7ATXTT9v169s099s08b7/9NhMmTGD8+PGkpqae1yzT2NatW7nuuusICQmhZ8+eLFy4sGFdSkoKM2bMIDExkddff53U1OanKk1LS2Po0KHEx8cDcMcdd7Bly5aG9ddffz0AEydOJDMz84L9q6urWbNmDddeey09e/Zk6tSpDf0YGzZsaOi/8PX1JTw8nA0bNrB48WIiIyMB64dhS6ZMmXLeGPs///nPJCUlMW3aNE6ePMnhw4fZtm0bM2fObNiu/rh33nknr776KmD9wPjWt77V4usp1RWkZBcD8MP5Iwnw8+G93VlujuhCLV7xG2NqReQBrBMw+wIvG2NSReRpINkYswp4CVguIhlYJ3ZeYneImcBJY8xRVwTc3JV5R1q0aBGPPvoou3fvpqKigokTJ3Ls2DGeffZZdu7cSe/evVm2bFmbx5UvW7aM999/n6SkJP7zn/+wadOmdsUbGBgIWBO3ozb29evXc/bsWRITEwGoqKggODiYq6++ulWv4+fn19AxbLFYGprDAHr06NHweNOmTXzyySd8+eWXhISEMHv27GZ/V4MGDSImJoYNGzawY8cOXn/99VbFpZS7pOaUEBLgy9iBvbh8dAyr953ip1ePxt+369w25VQkxpg1xph4Y8xwY8wztmVP2JI+xphKY8xiY0ycMWaKfZI3xmwyxkzrmPA7T2hoKHPmzOHOO+9suNovKSmhR48ehIeHk5uby9q1a5s9xsyZM3n//fc5d+4cpaWlrF69umFdaWkp/fr1o6am5rwkFxYWRmlp6QXHGjlyJJmZmWRkZACwfPlyZs2a5fT5vPnmm7z44otkZmaSmZnJsWPH+Pjjj6moqGDevHn8/e9/B6Curo7i4mLmzp3Lf//7XwoLrS149U09sbGx7Nq1C4BVq1ZRU1Pj8PWKi4vp3bs3ISEhHDp0iG3btgEwbdo0tmzZwrFjx847LsDdd9/NrbfeyuLFi/H19XX63JRyp5TsYkb364mvj3D9+AEUlVezOa1r9U91nY+gbmDp0qXs3bu3IfEnJSUxfvx4EhISuOWWW7j00kub3X/ChAncfPPNJCUlceWVVzJ58uSGdb/4xS+YOnUql156KQkJCQ3LlyxZwu9+9zvGjx/PkSNHGpYHBQXx73//m8WLF5OYmIiPjw/33XefU+dRUVHBunXrWLBgQcOyHj16MH36dFavXs2f/vQnNm7cSGJiIhMnTuTAgQOMGTOGn/zkJ8yaNYukpCS++93vAnDPPfewefNmkpKS+PLLL8+7yrc3f/58amtrGTVqFI8//jjTplmvBaKionjhhRe4/vrrSUpK4uabb27YZ+HChZSVlWkzj+o26iyGA6dKuGhAOAAz46OI6BHAyq+61uieLjfn7qRJk0zjiVgOHjzIqFGj3BSRcpfk5GQeffRRtm7d2uQ2+rehupKMvDIu+8NmfnfjWBZPso6JeWpVKm/sOMHOn1xGeHDHFRQUkV3GmEnObKtX/KpL+vWvf80NN9zA//t//8/doSjltNQca8du/RU/wPUTBlBda2HN/q5zJ68mftUlPf744xw/fpzp06e7OxSlnJaSXUyAnw9x0aENyxIHhDM8qgcrd3ed5p5uk/i7WpOUcj/9m1BdTUp2CaP6hp03gkdEuH7CQHZkFnWZEg7dIvEHBQVRWFio/9FVg/p6/EFBQe4ORSnA+jeZklPMGLtmnnrXjrdWuekqnbzdYiKWgQMHkpWVpbfsq/PUz8ClVFdwsugcpZW1XNT/wsQ/oFcw04b1YeVX2Tw4N87tJcW7ReL39/fXWZaUUl1aSkPHbk+H668fP5AfvruPPSfPMn5w784M7QLdoqlHKaW6upTsYvx8hPiYMIfrr0zsS6CfT5do7tHEr5RSLpCSU0J8TBhB/o7vMg8L8ueKMX1ZtTeH6lr3zn+hiV8ppdrJGENqdnGTzTz1rh8/gLMVNWxKy+ukyBzTxK+UUu10uqSSwvLq827ccmTGiEgiQ91fwkETv1JKtVNKtnWOkDEORvTY8/P14Zqk/nx6MI/iCscFDTuDJn6llGqnlOxifARG9XPcsWvv+vEDqa6z8MH+nE6IzDFN/Eop1U6pOcUMjwolJKDlEfIXDejJiOhQt5Zw0MSvlFLtlJJd0mL7fj0R4boJA0g+foYThe4p4aCJXyml2iG/tIrTJZWM6d/8iB57144bgIj7Sjho4ldKqXZwVIq5Jf17BTNtaATvfZXllhpkmviVUqodUnOsI3pGt+KKH6x1+o8XVrD7xNkOiKp5mviVUqodUrKLiY0IoWdQ62bXujKxH0H+Pqz8KquDImuaJn6llGqHpkoxtyQ00I8rRvflg32nqKqt64DImqaJXyml2uhsRTUni845LMXsjOsmWEs4bDzUuSXnNfErpVQb1bfvt1Sjpykz4iKJDA3s9OYeTfxKKdVGKdnWET0tlWpoip+vD4vG9WfDoTzOVlS7MrRmaeJXSqk2SskpYUCvYPr0CGjzMa4bP4CaOsMH+065MLLmaeJXSqk2Ss0ubtWNW46M6d+T+JhQ3tvdec09TiV+EZkvImkikiEijztYHygib9nWbxeRWLt1Y0XkSxFJFZH9IqKzYyulur3SyhqOFpST2IYRPfZEhOsnDGT3ibNkFpS7KLrmtZj4RcQX+CtwJTAaWCoioxttdhdwxhgTBzwH/Ma2rx/wGnCfMWYMMBtwXy1SpVSnqKyp48a/f8HyLzPdHUqHOXiqFGjdHbtNWTSuf6eWcHDmin8KkGGMOWqMqQZWAIsabbMIeMX2+B1gnlinkb8C2GeM2QtgjCk0xnTugFWlVKd76bNjJB8/w4qdJ90dSodp6Nht44gee/3Cg7lsVAzVdZ0zJWPLNURhAGD/7mUBU5vaxhhTKyLFQAQQDxgRWQ9EASuM
Mb9t/AIi8m3g2wCDBw9u7TkopbqQU8XneH5DBiEBvqTmlJBbUklMT89r4U3JKSY6LJDoMNec2wu3TcR6vdzxOrpz1w+YDnzT9u91IjKv8UbGmBeMMZOMMZOioqI6OCSlVEf61ZpDWIzhT0vGA7h9ftmOktqKUszO6KykD84l/mxgkN3zgbZlDrexteuHA4VYvx1sMcYUGGMqgDXAhPYGrZTqmrYfLWT13hzunTWcy0ZF0y88iA2HPC/xn6uu43BeKRe1c0SPuziT+HcCI0RkqIgEAEuAVY22WQXcYXt8I7DBWGuNrgcSRSTE9oEwCzjgmtCVUl1JbZ2FJ1elMqBXMPfPGo6IMCchms8OF3RYLZo6i6GypvO7DQ+dLsFiaFONnq6gxcRvjKkFHsCaxA8CbxtjUkXkaRFZaNvsJSBCRDKA7wKP2/Y9A/wB64fHHmC3MeZDl5+FUsrt3txxgkOnS/nJglEEB/gCMHdkNOXVdew8dqZDXvN369OY+IuP+d+ezp3QJKWhVEP3TPzOdO5ijFmDtZnGftkTdo8rgcVN7Psa1iGdSikPdaa8mmc/SufiYRFceVHfhuWXxEUQ4OfDhkN5TB8R6dLXrLMY3tmVRXWdhYdX7GHb0UKevGYMQf6+Ln0dR1Kzi+kd4k//8O7Zaa137iql2u3Zj9Ioq6rlqYVjzuukDAnw4+JhEWzsgA7e7ccKKSir4tnFSdw3azhv7jjJtX/9nIy8Mpe/VmMpOcVcNCC8UztkXUkTv1KqXVKyi3ljxwlumzaEkX3DLlg/NyGaYwXlHHPxXakf7jtFsL8vV4zuy+NXJvCfb00mr7SKa/7yGe/u6rjyB9W1FtJOl7a5MFtXoIlfKdVmxhh+vjqV3iEBPHp5vMNt5oyMBmCjC0f31NZZWJdymnmjohv6E2aPjGbNQzNIHBjO9/67l+//dy8V1bUue8166bml1NSZNpdi7go08Sul2mzV3hx2Zp7hh98YSXiw46kHB0eEMDyqh0ube7YdLaKwvJqrx/Y/b3nf8CDeuHsqD82N493dWSx8/nPSTpe67HXBbnJ1veJXSnmb8qpafrXmIGMHhnPTpEHNbjs3IZrtR4sor3LNFfgH+3LoEeDL7JEX3vDp5+vDd68YyfI7p3K2ooaFz3/GWztPYB1h3n4p2SWEBfoxuE+IS47nDpr4lVJt8vzGDHJLqnhq4Rh8fJrv5JyTEE11nYXPMgra/bo1dRbWpZ7m8tExzY7gmT4ikjUPT2dSbG8ee3c/j761hzIXfPCk5BQzun/PFs+5K9PEr5RqtWMF5by49Sg3TBjIhMG9W9x+cmwfwgL9XNLO/3lGAWcraljQqJnHkeiwIF69cyrfuzyeVXtzWPiXzxqaatqits7CwVOuLdXgDpr4lVKt9osPDhDo58tj80c6tb2/rw8z4iPZmJbX7iaXD/edIizQj5nxzt0X4OsjPDhvBG/cM43y6lqu+9sXbb7h62hBOZU1lm7dsQua+JVSrbThUC4bDuXx0Lw4oltRdXPOyGhyS6oaJihvi+paC+tTT3P5mBgC/Vp3o9a0YRGseWgG4wb14vv/3cv2o4Wtfv36UsztnXzF3TTxK6WcVlVbx9OrDzAsqgfLLhnaqn1n2Tpi29Pc81lGPiWVtVw9tl+b9o8IDeRft09icJ8Q7n1tF8cLW3dvQUp2CcH+vgyNDG3T63cVmviVUk57+bNMMgsrePKaMQT4tS59RIcFMXZgeLuGdX6w7xQ9g/yYHtf28u3hwf68dMdkAO78z06Kzzk/KWB9x65vN+7YBU38SiknnS6u5C8bDnP56Bhmxbct8c4ZGc1XJ89SVF7d6n0ra+r4ODWXb4zp2+oPncZiI3vwj1sncqKoggfe2E2NEzNfWSyGAzkl3bYUsz1N/Eopp/x67UFqLYafLWg85bbz5iZEYwxsTm/9Vf/WwwWUVtVydVLLo3mcMW1YBM9cl8jWwwX8fHVqi53OmYXllFXVdttSzPY08SulWrT1cD7v78nh3pnDGBzR9huXEgeEExkayIZD+a3e94N9OfQO8eeS4RFtfv3Gbpo0iHtnDeO1bSd45YvMZrdtKMXcje/YredUWWallPcxxrD9WBH/3HyEjWn5DOwdzP2zh7frmD4+wuyRUXyUepraOgt+vs5de1bW1PHJgVwWjuuPv5P7OOuxbyRwNL+cpz84wJDIHg21hRpLzS4mwNeHETHdu2MX9IpfKdVIncWwLuUU1/3tC5a8sI19WcV87/J4PnhwOiEB7b9WnJsQTUllLbtPnHV6n01p+ZRX17Eg0TXNPPZ8fIQ/3jyOhL49efCNr5qs7ZOSU0xCvzCXf/C4Q/c/A6WUS1TW1PHmjhNc/ofN3PfaborKq/nFtRfx+eNzeXDeCHqFBLjkdaaPiMTPR1o1F+8H+3KI6BHAtGF9XBJDYz0C/Xhp2SSCA3y565WdFJRVnbfeGENKdkm3LsVsTxO/Ul6u+FwNf9uUwYzfbuRH7+0nJNCX528Zz4bvzeK2aUNcPqNVzyB/JsX2ZpOTwzrPVdfx6cE85l/U1+mmobboFx7Mi7dPIr+0inuX7zpvLt+sM+coPlfT7e/YraeJXykvdbq4kl+tOcilv97Ab9elkdA3jNfvnsrqB6Zz9dj+HZpk5yZEc+h0Kdlnz7W47YZDeZyrqbugBHNHSBrUiz/cNI5dx8/wo/f2N4z08YRSzPa0c1cpL1NWVcvTq1NZ+VU2dRbDgrH9uXfmsE4tPDY3IZpfrTnExkN53DptSLPbfrg/h8jQQKYM7ZhmnsYWjO3Hkfx4/vBxOsOjevDA3BGkZJfg6yMOZxjrjjTxK+VlfrfuEO/syuLWaUO4e3r7hme21fCoUAb1CW4x8ZdX1bLhUB43TRrUqXfLPjg3jiP5ZTz7UTrDokJJySlmRHRop0zk3hk08SvlRQ7klLB823FunTaEpxdd5LY4RIS5I6N5K/kklTV1TSbUTw/lUVlj6ZRmnsbx/eaGsZwsquDRt/bg7+vD/Iv6dmoMHUnb+JXyEsYYnlqVSq+QAL7bxPy4nWlOQjSVNRa+bKZK5of7cojpGcikIS3X/He1IH9f/nnbJCJDAymrqvWIUg31NPEr5SVW7c1hR2YRP/jGSJcNzWyPacMiCPL3abJaZ2llDRvT8rkqsZ/bZruKCgvk5WWTmTikN7ObuLGrO9LEr5QXKGvF/LidJcjfl0uHR7LhkOPJWT49mEd1raXNJZhdZWTfMN69/xJiI3u4NQ5Xcirxi8h8EUkTkQwRedzB+kARecu2fruIxNqWx4rIORHZY/v5h4vjV0o54S8bDpNbUsXPF47pUiWF5yREk3XmHEfyyy5Y98G+HPqHBzF+UOc383i6FhO/iPgCfwWuBEYDS0WkcXm+u4Azxpg44DngN3brjhhjxtl+7nNR3EopJx3JL+Plz46xeOJAxjsxP25nmpNgbT5pfBdv8bkatqQXuLWZx5M5c8U/Bcgwxhw1xlQDK4BFjbZZBLxie/wOME9E9N1Sys2MMfx89QGC/Hz54fwEd4dzgQG9gknoG3ZB4v/4QC7
VdRaXlWBW53Mm8Q8ATto9z7Itc7iNMaYWKAbqa6cOFZGvRGSziMxw9AIi8m0RSRaR5Pz81pdrVUo59vGBXLak5/PI5fFEhQW6OxyH5iREk5x5hpLKr2fC+nBfDgN7B5M00DPulO1qOrpz9xQw2BgzHvgu8IaIXDAmyhjzgjFmkjFmUlRU26dUU0p9rbKmjl98eID4mFBuv7j5u2PdaW5CNLUWw9b0AgCKK2rYeriABWP7oQ0HHcOZxJ8N2A8DGGhb5nAbEfEDwoFCY0yVMaYQwBizCzgCuH8AsVJe4J+bj3Ky6BxPLRzTpUsJjx/Ui/Bg/4bmnvWpp6m1GK7ugBLMysqZv4adwAgRGSoiAcASYFWjbVYBd9ge3whsMMYYEYmydQ4jIsOAEcBR14SulOfJL63ibEXr56Nt7GRRBX/blMGCsf24ZHikCyLrOH6+PsyMj2Jzeh4Wi+GD/acY3CfEYyphdkUtlmwwxtSKyAPAesAXeNkYkyoiTwPJxphVwEvAchHJAIqwfjgAzASeFpEawALcZ4wp6ogTUao7KqmsYfvRIj7PKOCLIwWk55YREuDLzxeO4caJA9vc1PHMhwfxEeEnV41yccQdY25CFKv35rD5cD6fZxRw78xh2szTgZyq1WOMWQOsabTsCbvHlcBiB/u9C7zbzhiV8hiVNXXsPn6Gz48U8HlGIfuyzmIxEOTvw+TYPlw3fiCb0/P4wTv72HK4gF9eexHhwf6teo2th/NZl3qaH3xjJP17BXfQmbjWrPhoROCpVam2iqHuvWnL02mRNqU6UJ3FsD+7uOGKPjnzDFW1Fnx9hKSB4XxnThyXxkUyfnAvAv2shcq+PXMY/9h8hD98nM7u42f489JxTBziXEni6loLT61KZUhECHfPGNqRp+ZSfXoEMH5QL3afOMuwyB6M7qfNPB1JE79SLmCMobC8mvTcUg7nljX8e/B0CaWVtQAk9A3jm1OHcGlcBFOG9iEsyPGVvK+P8J05cVwyPIKHVnzFTf/cxkNzR/DA3LgW77p95YtMjuSX8/KySQ0fJN3F3IRodp84q6N5OoEmfqVaqbCsivTcMg7nlZKeW2p9nFvKmYqvx6GHBfkRHxPGNUn9mTYsgouHRbR6HP34wb1Z89AMfvZ+Cs99ks7nGQU8t2QcA5povskrqeRPnx5mbkI0cxNi2nWO7rAwaQBrU06zeGLXqCXkycRRcSR3mjRpkklOTnZ3GEqdp7Csih+v3E9y5hkKy78edRMW6MeImFDiY8IYERPGiGjr45iegS69al35VRY/XZmCr4/w6xvGclXihW3g331rDx/sO8VHj870qIJiyjkisssYM8mZbfWKX6kWnCis4I5/7yDn7DkWjevfkOTjY0Lp2zOoU5olrhs/kAmDe/PQij383+u7WTJ5EE9cM5qQAOt/4eTMIt77KpvvzBmuSV+1SBO/Us1IyS5m2b93Umux8MY9U53uZO0IQyJ68M59F/Pcx+n8ffMRdmQW8ecl4xnVrydP/C+VfuFBfGdOnNviU92HJn6lmvDZ4QLuXZ5Mr5AAVtw5lbho90+07e/rww/nJzA9LpJH397D9X/7gtkjozhwqoTnbxnf8A1AqeZ03fu4lXKj/+3J5lv/2cGgPiG893+XdImkb++SuEjWPTyTWSOj+OhALhcPi2CBg3Z/pRzRywOlGvnXlqM8s+YgU4f24YXbJ7X6BqrO0rtHAC/cNpGNaXkkDuilQyCV0zTxK2VjsRh+teYgL352jAWJ/fj9TUkE+XftsfAi0i2Hbir30sSvFNY7Xr//372s2pvDsktieeLq0Trzk/JYmviV1yutrOG+13bxeUYhj81P4L5ZWiBMeTZN/Mqr5ZVWsuzlnaTnlvL7xUncMHGgu0NSqsNp4lde62h+GXf8eweFZdW8eMckZo+MdndISnUKTfzK61TV1rElvYDH3t2HAG/eM42kQb3cHZZSnUYTv/IKJ4sq2JSez6ZDeXxxpJBzNXUM7hPCq3dO0RIHyuto4lceqaq2jp3HzrApLY+NaXkcyS8HYHCfEBZPGsjskVFcMjyyyw/XVKojaOJXHiPrTAWb0vLZlJbPF0cKqKiuI8DPh6lD+3DL1CHMHhnFsMgeOmJHeT1N/KrbS88t5cE3viIttxSAgb2DuWGC9ar+4uERWr9GqUb0f4Tq9v70yWFyis/x0wWjmD0ymuFRelWvVHM08atuLa+kkvWpp1l2SSx3zxjm7nCU6ha0Oqfq1t7aeZJai+Gb04a4OxSlug1N/Krbqq2z8OaOE0yPi2SoDslUymma+FW3teFQHjnFldyqV/tKtYomftVtvbb9BH17BnHZKC21oFRrOJX4RWS+iKSJSIaIPO5gfaCIvGVbv11EYhutHywiZSLyfRfFrbzc8cJytqTns2TKIPx89fpFqdZo8X+MiPgCfwWuBEYDS0VkdKPN7gLOGGPigOeA3zRa/wdgbfvDVcrqje0n8PURlkwe7O5QlOp2nLlUmgJkGGOOGmOqgRXAokbbLAJesT1+B5gntoHUInItcAxIdUnEyutV1tTxdvJJrhgdQ9/wIHeHo1S340ziHwCctHueZVvmcBtjTC1QDESISCjwGPDz9oeqlNWa/ac4U1GjnbpKtVFHN44+BTxnjClrbiMR+baIJItIcn5+fgeHpLq717YdZ1hkDy4ZHuHuUJTqlpxJ/NnAILvnA23LHG4jIn5AOFAITAV+KyKZwCPAj0XkgcYvYIx5wRgzyRgzKSoqqrXnoLxIak4xu0+c5ZvThmhZBqXayJmSDTuBESIyFGuCXwLc0mibVcAdwJfAjcAGY4wBZtRvICJPAWXGmOddELfyUq9tO0GQvw83TtApEpVqqxYTvzGm1naVvh7wBV42xqSKyNNAsjFmFfASsFxEMoAirB8OSrlUaWUN/9uTzTVj+xMe4u/ucJTqtpwq0maMWQOsabTsCbvHlcDiFo7xVBviU6rByq+yqaiu005dpdpJ73xR3YIxhte2HWfswHCdH1epdtLEr7qFHceKSM8t49aperWvVHtp4lfdwmvbT9AzyI9rkvq7OxSluj1N/KrLyy+tYl3KKW6YOJDgAJ0cXan20sTvIf7z+TH+t6fx7RWe4e3kk9TUGe3UVcpFdOpFD3A4t5SnPzhAVFgg14ztj4+P59zYVGcxvLH9BJcMj2B4VKi7w1HKI+gVvwf43fo0LAZyS6rYfeKMu8NxqU1peWSfPadX+0q5kCb+bm7X8TN8dCCXe2cNI8DPhw/2nXJ3SC712rbjRIcFcvnoGHeHopTH0MTfjRlj+M26Q0SGBvLwvBHMjo9ibcopLBbj7tBc4mRRBZvS81kyZTD+OtmKUi6j/5u6sU1p+ew4VsTD8+IICfBjwdh+HtXc8/r2E/iIsHTKoJY3Vko5TRN/N1VnsV7tD4kIYckU6yxU80bFeExzT1WtdbKVeQnR9AsPdnc4SnkUTfzd1P/2ZHPodCnfu2JkQzNIaKBfhzb3GGMoq6p1+XEdWZdymqLyam67WDt1lXI1r078G9Py+M7ruzstmblKVW0dv/8onY
sG9OTqxH7nratv7tnVAc09y7cdZ+ozn1BUXu3yYzf22rbjxEaEcOnwyA5/LaW8jVcn/lV7cvhw/ym+/WoylTV17g7Haa9vO0H22XM8Nj/hgjH79c09H7q4ucdiMby49Rjl1XVsPJTn0mM3duh0CTszz/DNqUM86p4EpboKr0786bmlRIYG8sWRQh5ZsYe6bjAaprSyhuc3ZnBpXAQzRlw4W1looB9zRrq+uWfz4XxOFFUgAp8eynXZcR15bdtxAvx8uHGiTraiVEfw2sRfZzFk5JWxaFx/fnb1aNalnuan7+/HOnFY1/WvrccoKq/msfkJTW5zVaLrm3uWf3mcyNBAbpgwkC3pBVTXWlx2bHsV1bWs3G2dbKV3j4AOeQ2lvJ3XJv4TRRVU1VoYGRPGXdOH8p05w3lzx0me/SjN3aE1Kb+0ihe3HmVBYj/GDuzV5HbzRsUQ6MLmnpNFFWxMy+OWKYOYP6YvZVW1bD9W6JJjN7YpLZ/y6jq92leqA3lt4k/PLQVgRIy1/sv3rxjJ0imD+evGI7y49ag7Q2vSXzYcpqrWwveuiG92u9BAP2aPjGLNftc097y27bh1PP3UwVwaF0mgnw+fHuyYdv61KaeJ6BHAlKF9OuT4SikvTvyHGxJ/GAAiwi+vvYgrL+rLLz88yLu7stwZ3gWOF5bzxvYT3Dx5EMOcKFa2YGx/8kqrSD7evuaeypo63ko+yeWjYugXHkxwgC/T4yL55GCuy5vFKmvq2HAwlyvG9MVXO3WV6jBem/jTcssY0CuY0MCvC5T6+gh/XDKOS+Mi+OG7+/jkQPs6MSuqa/nzp4eZ/Mwn/OHjdGrr2t4u/vuP0vH39eGReSOc2n5eQjSBfj6s2d++5p4P9p3ibEUNt9uNp583KoasM+dIzy1r17Eb23q4gPLqOq68qK9Lj6uUOp/XJv7DuaXEx1x45Rzo58s/b5vEmP49+c4bu9lxrKjVx66ts7Bixwlm/24Tf/g4ncjQQP786WFu/MeXZBaUt/p4KdnFrNqbw53TY4nuGeTUPj0C/ZgzMrrdzT3Lv8wkLjqUi4dHNCybNyoagE8OunZ0z9r9pwgP9j/vtZRSrueVib+2zsLR/HLibc08jYUG+vHvZZMZ0DuYu17ZyYGcEqeOa4xhw6FcrvrzVh5/bz8Dewfzzn0Xs/bhGTx/y3iO5pdx1Z+38tbOE61qJvnt+jR6hfhz76zhTu8DcNXYfu1q7tl78ix7s4q5bdoQRL5ueonpGUTigHA+dWHir6618PHBXC4fHaMF2ZTqYF75PyyzsILqOkuTiR8gIjSQ5XdNJTTQjzv+vYMThRXNHnNf1llu+dd27vxPMjV1hn/cOoF377+ESbHWTsqrx/Zn/aMzGTeoF4+9u5/7Xtvl1B2wXxwpYEt6Pt+ZHUfPIP9WnWd9c8+H+3JatV+9V788To8AX66fMODCY4+K5quTZykoq2rTsRv7/EgBpZW1XJWozTxKdTSvTPz1I3qaS/wAA3oFs/yuKdTUWbj1pe3klVZesM3JogoeevMrFj7/OWm5pTy9aAwfPTqT+Rf1O+8qGaBfeDCv3TWVn1w1io2H8pn/xy1sSc9v8vWtZZfT6B8e1KaaNfXNPWtTTrf65rSi8mpW78vhugkDCHPwgXPZqBiMwWV38a7bf5qwQD8ujdMSDUp1NK9N/CIQF93y6Ji46DD+vWwyBWVV3P7SDorP1QBwtqKaZz48wLzfb2Z96mm+M2c4m38wm9svjm22qcLHR7hn5jDe/86lhAf7c/vLO3hqVarDkhHrUk6z9+RZHrk8niD/tk0yvqC+uSezdX0VbyefpLrWwm3TYh2uH9O/J/3Cg1wyrLO2zsJHB04zd1Q0gX46mbpSHc0rE//h3DIG9wkhOMC5JDN+cG/+cetEjuSXcfcrO3lhyxFm/nYjL352jEXj+rPpB7P5wTcSHF4ZN2V0/56sfnA6yy6J5T9fZLLw+c/O60uorbPwu/VpjIgO5YYJbb+ZaW4bRvfUWQyvbz/OlKF9GNnX8bciEWFuQjRbD+dTVdu+OkfbjxVxpqKGKy/q1/LGSql2cyrxi8h8EUkTkQwRedzB+kARecu2fruIxNqWTxGRPbafvSJynYvjb5O03FJGRDffzNPYzPgo/nDTOJKPn+FXaw4xfnBv1j48g98tTmpzvfggf1+eWjiGV+6cwpmKGq796+f8a8tRLBbD28lZHC0o54fzE9o1pr1HoB9zE6JZ04rmns3peZwsOnfeEE5HLhsVQ3l1HduOtn7kk721KacI9vdlVvyFtYeUUq7n19IGIuIL/BW4HMgCdorIKmPMAbvN7gLOGGPiRGQJ8BvgZiAFmGSMqRWRfsBeEVltjHFbHeTqWguZBeVc0YY5XK9J6k9YkB+Bfr4uHXI4Kz6K9Y/M5PF39/HMmoNsTMvjSH4Zk4b05jLb0Mn2uCqxH2tTTpOcWcTUYS3H/eqX1nluvzGm+Y7Wi4dHEOzvy6cHc9uctOsshnUpucxNiHb6G5hSqn2cueKfAmQYY44aY6qBFcCiRtssAl6xPX4HmCciYoypsEvyQYDbK6AdKyin1mJa7NhtyuyR0R0yzrxPjwD+edtEfnNDIntOniW3pIrHrky4oIO4LeYmRBPk78OHTjT3HC8sZ3N6PkudmOc2yN+X6SMi+fRgXpvv4t11/AwFZVXM15u2lOo0ziT+AcBJu+dZtmUOt7El+mIgAkBEpopIKrAfuM/R1b6IfFtEkkUkOT+/6VEuruDsiB53EBFunjyYtQ/P4IXbJjI51jX1alozuqehLo9tOseWXDYqmuyz5zh4qrRNsa1NOUWAnw9zEtr/zUYp5ZwO79w1xmw3xowBJgM/EpELbj01xrxgjJlkjJkUFdWx7bzpuaX4CAyL6tGhr9MeQyJ6cEULzSyttWBsP/JLq9jZzOiec9V1vJ2cxTfGxNA33Lk7hOsTdltu5rJYDOtSTjMrPuq80hlKqY7lTOLPBgbZPR9oW+ZwGxHxA8KB8+r2GmMOAmXARW0N1hXSc0uJjejR5uGR3VV9c09zo3tW78uh+FxNk0M4HYkOCyJpUC8+acN4/r1ZZzlVXKm1eZTqZM4k/p3ACBEZKiIBwBJgVaNtVgF32B7fCGwwxhjbPn4AIjIESAAyXRJ5Gx3OLeuSzTwdLSTANrpnv+PmHmMMy788TnxMKNOGta6J6bKEaPaePOvwBrfmrE05jb+vMG9U6zvalVJt12Lit7XJPwCsBw4CbxtjUkXkaRFZaNvsJSBCRDKA7wL1Qz6nYx3JswdYCfyfMabAxefgtMqaOjILyx0WZ/MGVyX2o6DMcXPPnpNn2Z99YV0eZ9Qn7tbcxWuMYW3KKS6NiyQ8uHWlKJRS7eNUw6oxZg2wptGyJ+weVwKLHey3HFjezhhd5kh+GRbzdQ1+b9MwumffKaY1Gta53FaX59rxF9blacmofmH0Dw/ik4N53DzZuU7h1JwSThad44E5ca1+PaVU+3jVnbuHbfXjm7ob1dPVN/c0Ht1TWFbFB/tOcf2Ega26+7ieiLW55rPDBQ5LTziyN
uUUvj7C5aO1fV+pzuZViT89txQ/HyE2ouuO6OloCxL7U1BWdd48A28ln6S6ztKmQnD15o2K5lxNHV8eaXkuXmMMa/efZtqwPvTRCdWV6nRel/iHRvYgwM+rTvs8cxKizhvdU2cxvL7tBNOG9WlXp/e0YRGEBPg6NTlLem4ZRwvKtTaPUm7iVRkwPbeMeC9t5qkXEuDHvISYhuaejYfyyD57jtsvjm3XcYP8fZkxIpINh1q+i3dtyilE4IoxOppHKXfwmsRfUV3LyTMVxLeyOJsnqh/ds+NYEa9uO05Mz0Aub0PtosbmjYrhVHElqS3MWLZ2/2kmD+lDdJhzN4kppVzLaxJ/Rl4ZxuC1Qznt1Tf3/G1TBlucrMvjjLkJ0YjQbI3+o/llpOWWcqXOtKWU23hN4k+3jejx1qGc9uqbe7YeLsDPR7jFybo8LYkMDWTcoF58eqjpdv61KacBtCibUm7kNYn/cG4pAb4+xEaEuDuULuGqRGvH6jcu6kt0T9c1uVw2KoZ9WcXklji+i3dtyinGD+7V5jkMlFLt5zWJPz23lGFRPfBzQZOGJ5g3Kprrxg/g4XkjXH5cgA0O7uI9WVRBSnaJ1uZRys28Jgume2mNnqYE+fvy3M3jXP47GRkTxoBewQ6rda5NsQ4h1WGcSrmXVyT+sqpass+e89o7djuTiHD56Bg+y7jwLt61Kae5aEBPBvXR5jal3MkrEv9h2+QrI6J1RE9nmDcqmsoaC59nfF2P71TxOb46cVav9pXqArwi8XflWbc80dShEYQG+vGJ3bDOdbbRPNq+r5T7eUniLyPI30ebGDpJgJ8PM+Mj2XAot+Eu3rUppxkZE8awKP3WpZS7eUniLyUuOhRfn/ZPXK6cMy8hhtySKlKyS8grrWRnZpGO3Veqi/CKiU4P55ZxyfCIljdULjMnIRofgU8O5hIVFogxX987oJRyL49P/MXnajhdUun1xdk6W58eAUwY3JtPD+USHuzPsMgeWi5DqS7C45t6Djd07GrS6WzzRsWQkl3Cl0cKuTKxb6undFRKdQyPT/wNNXq0Kmenu8x2F6/F6E1bSnUlHt/Uk55bSkiALwN6aW2YzhYXHcrgPiFYjGFM/57uDkcpZeMViX9ETBg+OqKn04kIz908ruGxUqpr8ILEX8ackVHuDsNrTRzS290hKKUa8eg2/qLyagrKqvSOXaWUsuPRib+hVIMO5VRKqQYenfh1KKdSSl3IqcQvIvNFJE1EMkTkcQfrA0XkLdv67SISa1t+uYjsEpH9tn/nujj+ZqXnlhEW6EdfF84wpZRS3V2LiV9EfIG/AlcCo4GlIjK60WZ3AWeMMXHAc8BvbMsLgGuMMYnAHcByVwXujLTcUuL7humIEqWUsuPMFf8UIMMYc9QYUw2sABY12mYR8Irt8TvAPBERY8xXxpgc2/JUIFhEAl0ReEuMMRzOLdVmHqWUasSZxD8AOGn3PMu2zOE2xphaoBhoXBXtBmC3Maaq8QuIyLdFJFlEkvPz852NvVkFZdWcqajRO3aVUqqRTuncFZExWJt/7nW03hjzgjFmkjFmUlSUa8bc14/o0ekWlVLqfM4k/mxgkN3zgbZlDrcRET8gHCi0PR8IrARuN8YcaW/AzqpP/CO0qUcppc7jTOLfCYwQkaEiEgAsAVY12mYV1s5bgBuBDcYYIyK9gA+Bx40xn7soZqek55bRK8SfqNBO6VJQSqluo8XEb2uzfwBYDxwE3jbGpIrI0yKy0LbZS0CEiGQA3wXqh3w+AMQBT4jIHttPtMvPwoHDuaXER+uIHqWUasypWj3GmDXAmkbLnrB7XAksdrDfL4FftjPGVjPGkJZbyqJx/Tv7pZVSqsvzyDt3c0uqKK2s1Ro9SinlgEcm/oaOXR3KqZRSF/DoxK83byml1IU8NvFHhgYQoSN6lFLqAh6a+Mu0mUcppZrgcYm/vkaP3rGrlFKOeVzizz57jvLqOr1jVymlmuBxif9wbhmADuVUSqkmeFzibxjRo238SinlkMcl/rTcUmJ6BhIe4u/uUJRSqkvyuMR/OLdMm3mUUqoZHpX4LRZDRp4mfqWUao5HJf6sM+c4V1Ond+wqpVQzPCrxpzVMvqJX/Eop1RSPSvxfF2fTK36llGqKRyX+w7mlDOgVTFiQjuhRSqmmeFTiT8st0zt2lVKqBR6T+OsshiP5OqJHKaVa4jGJ/3hhOdW1Fk38SinVAo9J/BYDVyX2JXFAuLtDUUqpLs2pyda7g7joUP72zYnuDkMppbo8j7niV0op5RxN/Eop5WU08SullJfRxK+UUl7GqcQvIvNFJE1EMkTkcQfrA0XkLdv67SISa1seISIbRaRMRJ53cexKKaXaoMXELyK+wF+BK4HRwFIRGd1os7uAM8aYOOA54De25ZXAz4DvuyxipZRS7eLMFf8UIMMYc9QYUw2sABY12mYR8Irt8TvAPBERY0y5MeYzrB8ASimlugBnEv8A4KTd8yzbMofbGGNqgWIgwhUBKqWUcq0ucQOXiHwb+LbtaZmIpLXjcJFAQfuj6pb03L2XN5+/N587fH3+Q5zdwZnEnw0Msns+0LbM0TZZIuIHhAOFzgZhjHkBeMHZ7ZsjIsnGmEmuOFZ3o+funecO3n3+3nzu0Lbzd6apZycwQkSGikgAsARY1WibVcAdtsc3AhuMMaY1gSillOocLV7xG2NqReQBYD3gC7xsjEkVkaeBZGPMKuAlYLmIZABFWD8cABCRTKAnECAi1wJXGGMOuPxMlFJKOcWpNn5jzBpgTaNlT9g9rgQWN7FvbDviawuXNBl1U3ru3subz9+bzx3acP6iLTJKKeVdtGSDUkp5GU38SinlZTwm8bdUT8jTiUimiOwXkT0ikuzueDqSiLwsInkikmK3rI+IfCwih23/9nZnjB2pifN/SkSybe//HhG5yp0xdhQRGWSr/3VARFJF5GHbco9//5s591a/9x7Rxm+rJ5QOXI71zuKdwFJvGj1kGz01yRjj8TeyiMhMoAx41RhzkW3Zb4EiY8yvbR/8vY0xj7kzzo7SxPk/BZQZY551Z2wdTUT6Af2MMbtFJAzYBVwLLMPD3/9mzv0mWvnee8oVvzP1hJSHMMZswTps2J59vahXsP6H8EhNnL9XMMacMsbstj0uBQ5iLRnj8e9/M+feap6S+J2pJ+TpDPCRiOyylcDwNjHGmFO2x6eBGHcG4yYPiMg+W1OQxzV1NGYr/z4e2I6Xvf+Nzh1a+d57SuJXMN0YMwFr+ezv2JoDvJLtrvHu34bZOn8HhgPjgFPA790aTQcTkVDgXeARY0yJ/TpPf/8dnHur33tPSfzO1BPyaMaYbNu/ecBKrM1f3iTX1gZa3xaa5+Z4OpUxJtcYU2eMsQD/woPffxHxx5r4XjfGvGdb7BXvv6Nzb8t77ymJ35l6Qh5LRHrYOnsQkR7AFUBK83t5HPt6UXcA/3NjLJ2uPunZXIeHvv8iIlhLxBw0xvzBbpXHv/9NnXtb3nuPGNUDYBvC9Ee+rif0jHsj6jwiMgzrVT5Yy3C84cnnLyJvArOxlqPN
BZ4E3gfeBgYDx4GbjDEe2QHaxPnPxvpV3wCZwL12bd4eQ0SmA1uB/YDFtvjHWNu6Pfr9b+bcl9LK995jEr9SSinneEpTj1JKKSdp4ldKKS+jiV8ppbyMJn6llPIymviVUsrLaOJXSikvo4lfKaW8zP8HR4iWwpkjGi0AAAAASUVORK5CYII=\n" | |
}, | |
"metadata": { | |
"needs_background": "light" | |
} | |
}, | |
{ | |
"output_type": "stream", | |
"name": "stdout", | |
"text": [ | |
"\n", | |
"Best model (it: 24): Loss: 3.1288, Accuracy: 0.0703, F1: 0.0822, Precision: 0.1112, Recall: 0.0703 \n" | |
] | |
} | |
], | |
"source": [ | |
"plt.title(f'Training {cfg.max_iter} iterations on {dataset.name}')\n", | |
"plt.plot(np.log(history['train_loss']), label=\"Training Loss (log scale)\")\n", | |
"plt.legend()\n", | |
"plt.show()\n", | |
"\n", | |
"plt.title(f'Evaluation on {dataset.name}')\n", | |
"plt.plot(history['val_acc'], label=\"Validation Accuracy\")\n", | |
"plt.legend()\n", | |
"plt.show()\n", | |
"\n", | |
"best_it = np.argmax(np.array(history['val_f1']))\n", | |
"print('')\n", | |
"print(f'Best model (it: {best_it+1}): '\n", | |
" f'Loss: {history[\"train_loss\"][best_it]:.4f}, '\n", | |
" f'Accuracy: {history[\"val_acc\"][best_it]:.4f}, '\n", | |
" f'F1: {history[\"val_f1\"][best_it]:.4f}, '\n", | |
" f'Precision: {history[\"val_metrics\"][best_it][\"precision\"]:.4f}, '\n", | |
" f'Recall: {history[\"val_metrics\"][best_it][\"recall\"]:.4f} ')" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": { | |
"id": "S49PVlqe9IIs" | |
}, | |
"source": [ | |
"# 6. Summary\n", | |
"\n", | |
"You have learned how graphs can be batched together for better GPU utilization, and how to apply readout layers for obtaining graph embeddings rather than node embeddings.\n", | |
"\n", | |
"We performed the experiments using GIN architecture on a Python code dataset. We highlight how providing structured information from the AST to the Message Passing framework results in better performance.\n", | |
"\n", | |
"Finally, you can refer to the [OGB leaderboard](https://ogb.stanford.edu/docs/leader_graphprop/#ogbg-code2) to help you implement more complex GNN architectures and extend this experiment." | |
] | |
} | |
], | |
"metadata": { | |
"colab": { | |
"provenance": [] | |
}, | |
"gpuClass": "standard", | |
"kernelspec": { | |
"display_name": "Python 3", | |
"name": "python3" | |
}, | |
"language_info": { | |
"name": "python" | |
}, | |
"accelerator": "GPU" | |
}, | |
"nbformat": 4, | |
"nbformat_minor": 0 | |
} |