{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "GCN2_CORA.ipynb",
"provenance": [],
"collapsed_sections": [],
"authorship_tag": "ABX9TyMAdGGnUE6Z7yL2oOMeEWYR",
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/shravankumar147/acdcd9799b008b96630afcba187bca46/gcn2_cora.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"source": [
"# [GCN2_CORA](https://github.com/pyg-team/pytorch_geometric/blob/e18a897e8a63f78288426dc4e1574a9041d8ecae/torch_geometric/nn/conv/gcn2_conv.py)\n",
"\n",
"The graph convolutional operator with initial residual connections and\n",
" identity mapping (GCNII) from the `\"Simple and Deep Graph Convolutional\n",
" Networks\" <https://arxiv.org/abs/2007.02133>`_ paper\n",
"\n",
"$$X′ = ((1 − α)P̂X + αX(0))((1 − β)I + βΘ)$$\n",
"with $P̂ = D̂ − 1 ⁄ 2 ÂD̂ − 1 ⁄ 2 $, \n",
"where $Â = A + I $ denotes the adjacency matrix with inserted self-loops and $D̂ii = ∑j = 0Âij$ its diagonal degree matrix, and X(0) being the initial feature representation. Here, α models the strength of the initial residual connection, while β models the strength of the identity mapping. The adjacency matrix can include other values than 1 representing edge weights via the optional edge_weight tensor. "
],
"metadata": {
"id": "GE2UGP0xUEzr"
}
},
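{
"cell_type": "markdown",
"source": [
"To make the update rule concrete, the next cell is a small added sketch (not part of the original gist) that applies the GCNII propagation once to a toy 4-node path graph using plain `torch`; `alpha`, `beta`, and the weight `Theta` here are illustrative placeholders, not trained values."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Toy sketch of the GCNII update (illustrative; sizes and values are made up):\n",
"# X' = ((1 - alpha) * P_hat @ X + alpha * X0) @ ((1 - beta) * I + beta * Theta)\n",
"import torch\n",
"\n",
"torch.manual_seed(0)\n",
"N, F_dim = 4, 3  # 4 nodes, 3 features\n",
"A = torch.tensor([[0., 1., 0., 0.],\n",
"                  [1., 0., 1., 0.],\n",
"                  [0., 1., 0., 1.],\n",
"                  [0., 0., 1., 0.]])\n",
"A_hat = A + torch.eye(N)                  # insert self-loops\n",
"deg = A_hat.sum(dim=1)\n",
"P_hat = torch.diag(deg.pow(-0.5)) @ A_hat @ torch.diag(deg.pow(-0.5))\n",
"\n",
"alpha, beta = 0.1, 0.5\n",
"X0 = torch.randn(N, F_dim)                # initial representation X^(0)\n",
"X = torch.randn(N, F_dim)                 # current layer input\n",
"Theta = torch.randn(F_dim, F_dim)         # weight matrix (random stand-in)\n",
"\n",
"X_out = ((1 - alpha) * P_hat @ X + alpha * X0) @ (\n",
"    (1 - beta) * torch.eye(F_dim) + beta * Theta)\n",
"print(X_out.shape)  # torch.Size([4, 3])"
],
"metadata": {},
"execution_count": null,
"outputs": []
},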
{
"cell_type": "markdown",
"source": [
"## Import Libraies"
],
"metadata": {
"id": "PM5B68YhTyNz"
}
},
{
"cell_type": "code",
"source": [
"!python -c \"import torch; print(torch.__version__)\"\n",
"!python -c \"import torch; print(torch.version.cuda)\""
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "cRq3CbXTRxiG",
"outputId": "66e42d58-63a5-4b94-b114-b74655ce8543"
},
"execution_count": 1,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"1.10.0+cu111\n",
"11.1\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"import platform"
],
"metadata": {
"id": "5DvbeehVR3Qv"
},
"execution_count": 2,
"outputs": []
},
{
"cell_type": "code",
"source": [
"print(platform.python_version())"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "phNgDPzwSLIu",
"outputId": "72b1b13b-f573-4602-a016-1de158177014"
},
"execution_count": 3,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"3.7.12\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"print(platform.system())"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "XBEy4tHCSKB4",
"outputId": "22237894-b449-430c-ea17-57530c0a95e2"
},
"execution_count": 6,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Linux\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"%%time\n",
"# Install required packages.\n",
"!pip install -q torch-scatter -f https://data.pyg.org/whl/torch-1.10.0+cu111.html\n",
"!pip install -q torch-sparse -f https://data.pyg.org/whl/torch-1.10.0+cu111.html\n",
"!pip install -q git+https://github.com/pyg-team/pytorch_geometric.git"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "aLGb7dh_RyMv",
"outputId": "a3b03de8-558e-4c41-eca4-756faa781467"
},
"execution_count": 7,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"\u001b[K |████████████████████████████████| 7.9 MB 2.5 MB/s \n",
"\u001b[K |████████████████████████████████| 3.5 MB 4.0 MB/s \n",
"\u001b[?25h Building wheel for torch-geometric (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
"CPU times: user 140 ms, sys: 41.5 ms, total: 181 ms\n",
"Wall time: 16.5 s\n"
]
}
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"id": "j5ECCjrMRjD0"
},
"outputs": [],
"source": [
"# Ref: https://github.com/pyg-team/pytorch_geometric/blob/master/examples/gcn2_cora.py\n",
"\n",
"import os.path as osp\n",
"\n",
"import torch\n",
"import torch.nn.functional as F\n",
"from torch.nn import Linear\n",
"\n",
"import torch_geometric.transforms as T\n",
"from torch_geometric.datasets import Planetoid\n",
"from torch_geometric.nn import GCN2Conv\n",
"from torch_geometric.nn.conv.gcn_conv import gcn_norm\n"
]
},
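{
"cell_type": "markdown",
"source": [
"A small added cell (not in the original gist) to confirm the PyG version that was just installed from source; the API used below follows the referenced example."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Added check: report the installed torch_geometric version.\n",
"import torch_geometric\n",
"print(torch_geometric.__version__)"
],
"metadata": {},
"execution_count": null,
"outputs": []
},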
{
"cell_type": "markdown",
"source": [
"## Data"
],
"metadata": {
"id": "AVwghkinTvKM"
}
},
{
"cell_type": "code",
"source": [
"\n",
"dataset_name = 'Cora'\n",
"# path = osp.join(osp.dirname(osp.realpath(__file__)), '..', 'data', dataset)\n",
"\n",
"path = f\"/tmp/{dataset_name}\"\n",
"transform = T.Compose([T.NormalizeFeatures(), T.ToSparseTensor()])\n",
"dataset = Planetoid(path, dataset_name, transform=transform)\n",
"data = dataset[0]"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "lgtwHAaJRtuM",
"outputId": "681bd3c8-7e88-43bf-d17b-c21ffe13ba77"
},
"execution_count": 9,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.x\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.tx\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.allx\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.y\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.ty\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.ally\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.graph\n",
"Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.test.index\n",
"Processing...\n",
"Done!\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"print(data)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "JbnTrlwwS2ko",
"outputId": "87ce9275-dcb4-4830-9e0d-fc91d65aeae5"
},
"execution_count": 10,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Data(x=[2708, 1433], y=[2708], train_mask=[2708], val_mask=[2708], test_mask=[2708], adj_t=[2708, 2708, nnz=10556])\n"
]
}
]
},
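{
"cell_type": "markdown",
"source": [
"A few basic statistics for reference (an added sanity-check cell, not in the original gist; it only uses standard PyG `Data` and dataset attributes)."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Added sanity check: dataset sizes via standard PyG attributes.\n",
"print(f'Nodes: {data.num_nodes}, Features: {dataset.num_features}, '\n",
"      f'Classes: {dataset.num_classes}')\n",
"print(f'Train/Val/Test nodes: {int(data.train_mask.sum())}/'\n",
"      f'{int(data.val_mask.sum())}/{int(data.test_mask.sum())}')"
],
"metadata": {},
"execution_count": null,
"outputs": []
},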
{
"cell_type": "code",
"source": [
"data.adj_t = gcn_norm(data.adj_t) # Pre-process GCN normalization."
],
"metadata": {
"id": "UYPXe2AGS0f8"
},
"execution_count": 11,
"outputs": []
},
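{
"cell_type": "markdown",
"source": [
"Because normalization is applied once here, the model below is built with `normalize=False`. As a quick added check (not in the original run), one can inspect the normalized sparse adjacency; `coo()` on a `torch_sparse.SparseTensor` returns the row, column, and value tensors."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Added check: after gcn_norm, adj_t holds D^{-1/2} (A + I) D^{-1/2}.\n",
"row, col, value = data.adj_t.coo()\n",
"print(data.adj_t)  # SparseTensor summary, now including self-loops\n",
"print(float(value.min()), float(value.max()))  # normalized weights in (0, 1]"
],
"metadata": {},
"execution_count": null,
"outputs": []
},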
{
"cell_type": "markdown",
"source": [
"## Model "
],
"metadata": {
"id": "RSeD1M-oTsOS"
}
},
{
"cell_type": "code",
"source": [
"class Net(torch.nn.Module):\n",
" def __init__(self, hidden_channels, num_layers, alpha, theta,\n",
" shared_weights=True, dropout=0.0):\n",
" super().__init__()\n",
"\n",
" self.lins = torch.nn.ModuleList()\n",
" self.lins.append(Linear(dataset.num_features, hidden_channels))\n",
" self.lins.append(Linear(hidden_channels, dataset.num_classes))\n",
"\n",
" self.convs = torch.nn.ModuleList()\n",
" for layer in range(num_layers):\n",
" self.convs.append(\n",
" GCN2Conv(hidden_channels, alpha, theta, layer + 1,\n",
" shared_weights, normalize=False))\n",
"\n",
" self.dropout = dropout\n",
"\n",
" def forward(self, x, adj_t):\n",
" x = F.dropout(x, self.dropout, training=self.training)\n",
" x = x_0 = self.lins[0](x).relu()\n",
"\n",
" for conv in self.convs:\n",
" x = F.dropout(x, self.dropout, training=self.training)\n",
" x = conv(x, x_0, adj_t)\n",
" x = x.relu()\n",
"\n",
" x = F.dropout(x, self.dropout, training=self.training)\n",
" x = self.lins[1](x)\n",
"\n",
" return x.log_softmax(dim=-1)\n",
"\n",
"\n",
"device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n",
"model = Net(hidden_channels=64, num_layers=64, alpha=0.1, theta=0.5,\n",
" shared_weights=True, dropout=0.6).to(device)\n",
"data = data.to(device)\n",
"optimizer = torch.optim.Adam([\n",
" dict(params=model.convs.parameters(), weight_decay=0.01),\n",
" dict(params=model.lins.parameters(), weight_decay=5e-4)\n",
"], lr=0.01)"
],
"metadata": {
"id": "kf3vJZlzRsa3"
},
"execution_count": 12,
"outputs": []
},
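{
"cell_type": "markdown",
"source": [
"As a quick added check (not in the original gist), counting trainable parameters shows how the two optimizer groups split: the 64 `GCN2Conv` layers, regularized with the stronger weight decay, versus the two `Linear` layers."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Added check: parameter counts per optimizer group.\n",
"num_conv = sum(p.numel() for p in model.convs.parameters())\n",
"num_lin = sum(p.numel() for p in model.lins.parameters())\n",
"print(f'convs: {num_conv:,} parameters, lins: {num_lin:,} parameters')"
],
"metadata": {},
"execution_count": null,
"outputs": []
},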
{
"cell_type": "code",
"source": [
"def train():\n",
" model.train()\n",
" optimizer.zero_grad()\n",
" out = model(data.x, data.adj_t)\n",
" loss = F.nll_loss(out[data.train_mask], data.y[data.train_mask])\n",
" loss.backward()\n",
" optimizer.step()\n",
" return float(loss)\n",
"\n",
"\n",
"@torch.no_grad()\n",
"def test():\n",
" model.eval()\n",
" pred, accs = model(data.x, data.adj_t).argmax(dim=-1), []\n",
" for _, mask in data('train_mask', 'val_mask', 'test_mask'):\n",
" accs.append(int((pred[mask] == data.y[mask]).sum()) / int(mask.sum()))\n",
" return accs"
],
"metadata": {
"id": "M4fK__AQRqTA"
},
"execution_count": 13,
"outputs": []
},
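{
"cell_type": "markdown",
"source": [
"Note that `data('train_mask', 'val_mask', 'test_mask')` iterates over the named attributes as `(key, value)` pairs. An equivalent, more explicit version of `test()` (an added sketch, behaviorally the same) is shown below."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Added sketch: test() with the three masks spelled out explicitly.\n",
"@torch.no_grad()\n",
"def test_explicit():\n",
"    model.eval()\n",
"    pred = model(data.x, data.adj_t).argmax(dim=-1)\n",
"    accs = []\n",
"    for mask in [data.train_mask, data.val_mask, data.test_mask]:\n",
"        accs.append(int((pred[mask] == data.y[mask]).sum()) / int(mask.sum()))\n",
"    return accs"
],
"metadata": {},
"execution_count": null,
"outputs": []
},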
{
"cell_type": "markdown",
"source": [
"## Train and Evaluate"
],
"metadata": {
"id": "ORIYE357T-vR"
}
},
{
"cell_type": "code",
"source": [
"%%time\n",
"best_val_acc = test_acc = 0\n",
"for epoch in range(1, 1001):\n",
" loss = train()\n",
" train_acc, val_acc, tmp_test_acc = test()\n",
" if val_acc > best_val_acc:\n",
" best_val_acc = val_acc\n",
" test_acc = tmp_test_acc\n",
" print(f'Epoch: {epoch:04d}, Loss: {loss:.4f} Train: {train_acc:.4f}, '\n",
" f'Val: {val_acc:.4f}, Test: {tmp_test_acc:.4f}, '\n",
" f'Final Test: {test_acc:.4f}')"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "qZUT_TdsRrVV",
"outputId": "1906a0b5-82af-4aff-b8d3-880d1c24be35"
},
"execution_count": 14,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Epoch: 0001, Loss: 1.9467 Train: 0.1429, Val: 0.0720, Test: 0.0910, Final Test: 0.0910\n",
"Epoch: 0002, Loss: 1.9434 Train: 0.1429, Val: 0.0700, Test: 0.0910, Final Test: 0.0910\n",
"Epoch: 0003, Loss: 1.9438 Train: 0.1786, Val: 0.0740, Test: 0.0970, Final Test: 0.0970\n",
"Epoch: 0004, Loss: 1.9408 Train: 0.2643, Val: 0.1140, Test: 0.1190, Final Test: 0.1190\n",
"Epoch: 0005, Loss: 1.9387 Train: 0.4714, Val: 0.2020, Test: 0.2300, Final Test: 0.2300\n",
"Epoch: 0006, Loss: 1.9392 Train: 0.6357, Val: 0.4080, Test: 0.4040, Final Test: 0.4040\n",
"Epoch: 0007, Loss: 1.9371 Train: 0.5286, Val: 0.4040, Test: 0.3970, Final Test: 0.4040\n",
"Epoch: 0008, Loss: 1.9328 Train: 0.4857, Val: 0.4260, Test: 0.4100, Final Test: 0.4100\n",
"Epoch: 0009, Loss: 1.9314 Train: 0.4714, Val: 0.4180, Test: 0.4040, Final Test: 0.4100\n",
"Epoch: 0010, Loss: 1.9259 Train: 0.4929, Val: 0.4180, Test: 0.4010, Final Test: 0.4100\n",
"Epoch: 0011, Loss: 1.9246 Train: 0.5214, Val: 0.4280, Test: 0.4140, Final Test: 0.4140\n",
"Epoch: 0012, Loss: 1.9188 Train: 0.6143, Val: 0.4560, Test: 0.4340, Final Test: 0.4340\n",
"Epoch: 0013, Loss: 1.9134 Train: 0.7071, Val: 0.5020, Test: 0.4950, Final Test: 0.4950\n",
"Epoch: 0014, Loss: 1.9164 Train: 0.8071, Val: 0.6040, Test: 0.6150, Final Test: 0.6150\n",
"Epoch: 0015, Loss: 1.9123 Train: 0.8571, Val: 0.6900, Test: 0.7040, Final Test: 0.7040\n",
"Epoch: 0016, Loss: 1.9056 Train: 0.8929, Val: 0.7580, Test: 0.7780, Final Test: 0.7780\n",
"Epoch: 0017, Loss: 1.8996 Train: 0.9286, Val: 0.7800, Test: 0.7990, Final Test: 0.7990\n",
"Epoch: 0018, Loss: 1.8857 Train: 0.9214, Val: 0.7740, Test: 0.8000, Final Test: 0.7990\n",
"Epoch: 0019, Loss: 1.8796 Train: 0.9214, Val: 0.7760, Test: 0.7950, Final Test: 0.7990\n",
"Epoch: 0020, Loss: 1.8900 Train: 0.9286, Val: 0.7840, Test: 0.8000, Final Test: 0.8000\n",
"Epoch: 0021, Loss: 1.8659 Train: 0.9357, Val: 0.7780, Test: 0.7920, Final Test: 0.8000\n",
"Epoch: 0022, Loss: 1.8812 Train: 0.9286, Val: 0.7840, Test: 0.7990, Final Test: 0.8000\n",
"Epoch: 0023, Loss: 1.8749 Train: 0.9286, Val: 0.7860, Test: 0.8130, Final Test: 0.8130\n",
"Epoch: 0024, Loss: 1.8777 Train: 0.9357, Val: 0.7900, Test: 0.8150, Final Test: 0.8150\n",
"Epoch: 0025, Loss: 1.8620 Train: 0.9357, Val: 0.7900, Test: 0.8280, Final Test: 0.8150\n",
"Epoch: 0026, Loss: 1.8413 Train: 0.9286, Val: 0.7880, Test: 0.8290, Final Test: 0.8150\n",
"Epoch: 0027, Loss: 1.8373 Train: 0.9286, Val: 0.7860, Test: 0.8290, Final Test: 0.8150\n",
"Epoch: 0028, Loss: 1.8336 Train: 0.9357, Val: 0.7800, Test: 0.8160, Final Test: 0.8150\n",
"Epoch: 0029, Loss: 1.8526 Train: 0.9071, Val: 0.7620, Test: 0.7730, Final Test: 0.8150\n",
"Epoch: 0030, Loss: 1.8254 Train: 0.8929, Val: 0.7200, Test: 0.7340, Final Test: 0.8150\n",
"Epoch: 0031, Loss: 1.7921 Train: 0.8714, Val: 0.6580, Test: 0.6780, Final Test: 0.8150\n",
"Epoch: 0032, Loss: 1.8169 Train: 0.8571, Val: 0.6360, Test: 0.6380, Final Test: 0.8150\n",
"Epoch: 0033, Loss: 1.7966 Train: 0.8357, Val: 0.6080, Test: 0.6210, Final Test: 0.8150\n",
"Epoch: 0034, Loss: 1.7743 Train: 0.8429, Val: 0.6060, Test: 0.6180, Final Test: 0.8150\n",
"Epoch: 0035, Loss: 1.8025 Train: 0.8429, Val: 0.6200, Test: 0.6220, Final Test: 0.8150\n",
"Epoch: 0036, Loss: 1.7687 Train: 0.8571, Val: 0.6280, Test: 0.6370, Final Test: 0.8150\n",
"Epoch: 0037, Loss: 1.8168 Train: 0.8786, Val: 0.6600, Test: 0.6850, Final Test: 0.8150\n",
"Epoch: 0038, Loss: 1.7781 Train: 0.9000, Val: 0.7040, Test: 0.7240, Final Test: 0.8150\n",
"Epoch: 0039, Loss: 1.7381 Train: 0.9000, Val: 0.7240, Test: 0.7450, Final Test: 0.8150\n",
"Epoch: 0040, Loss: 1.7527 Train: 0.9071, Val: 0.7360, Test: 0.7550, Final Test: 0.8150\n",
"Epoch: 0041, Loss: 1.7345 Train: 0.9143, Val: 0.7440, Test: 0.7670, Final Test: 0.8150\n",
"Epoch: 0042, Loss: 1.7951 Train: 0.9071, Val: 0.7460, Test: 0.7740, Final Test: 0.8150\n",
"Epoch: 0043, Loss: 1.6967 Train: 0.9071, Val: 0.7520, Test: 0.7770, Final Test: 0.8150\n",
"Epoch: 0044, Loss: 1.7267 Train: 0.9143, Val: 0.7740, Test: 0.7820, Final Test: 0.8150\n",
"Epoch: 0045, Loss: 1.7151 Train: 0.9143, Val: 0.7800, Test: 0.7930, Final Test: 0.8150\n",
"Epoch: 0046, Loss: 1.7346 Train: 0.9143, Val: 0.8000, Test: 0.8070, Final Test: 0.8070\n",
"Epoch: 0047, Loss: 1.6701 Train: 0.9071, Val: 0.8020, Test: 0.8100, Final Test: 0.8100\n",
"Epoch: 0048, Loss: 1.6680 Train: 0.9143, Val: 0.8120, Test: 0.8160, Final Test: 0.8160\n",
"Epoch: 0049, Loss: 1.6866 Train: 0.9286, Val: 0.8000, Test: 0.8190, Final Test: 0.8160\n",
"Epoch: 0050, Loss: 1.9452 Train: 0.9286, Val: 0.8020, Test: 0.8260, Final Test: 0.8160\n",
"Epoch: 0051, Loss: 1.5844 Train: 0.9214, Val: 0.8060, Test: 0.8260, Final Test: 0.8160\n",
"Epoch: 0052, Loss: 1.5701 Train: 0.9214, Val: 0.8100, Test: 0.8290, Final Test: 0.8160\n",
"Epoch: 0053, Loss: 1.6297 Train: 0.9214, Val: 0.7980, Test: 0.8200, Final Test: 0.8160\n",
"Epoch: 0054, Loss: 1.6528 Train: 0.9357, Val: 0.7860, Test: 0.8160, Final Test: 0.8160\n",
"Epoch: 0055, Loss: 1.6172 Train: 0.9286, Val: 0.7860, Test: 0.8130, Final Test: 0.8160\n",
"Epoch: 0056, Loss: 1.5894 Train: 0.9286, Val: 0.7820, Test: 0.8070, Final Test: 0.8160\n",
"Epoch: 0057, Loss: 1.5703 Train: 0.9214, Val: 0.7700, Test: 0.7900, Final Test: 0.8160\n",
"Epoch: 0058, Loss: 1.5349 Train: 0.9143, Val: 0.7620, Test: 0.7820, Final Test: 0.8160\n",
"Epoch: 0059, Loss: 1.5621 Train: 0.9143, Val: 0.7540, Test: 0.7760, Final Test: 0.8160\n",
"Epoch: 0060, Loss: 1.5115 Train: 0.9214, Val: 0.7620, Test: 0.7820, Final Test: 0.8160\n",
"Epoch: 0061, Loss: 1.5652 Train: 0.9214, Val: 0.7580, Test: 0.7850, Final Test: 0.8160\n",
"Epoch: 0062, Loss: 1.5814 Train: 0.9214, Val: 0.7700, Test: 0.7880, Final Test: 0.8160\n",
"Epoch: 0063, Loss: 1.5340 Train: 0.9286, Val: 0.7800, Test: 0.7950, Final Test: 0.8160\n",
"Epoch: 0064, Loss: 1.4907 Train: 0.9214, Val: 0.7840, Test: 0.8040, Final Test: 0.8160\n",
"Epoch: 0065, Loss: 1.5125 Train: 0.9286, Val: 0.7940, Test: 0.8080, Final Test: 0.8160\n",
"Epoch: 0066, Loss: 1.5077 Train: 0.9214, Val: 0.8000, Test: 0.8150, Final Test: 0.8160\n",
"Epoch: 0067, Loss: 1.4817 Train: 0.9214, Val: 0.8020, Test: 0.8210, Final Test: 0.8160\n",
"Epoch: 0068, Loss: 1.5050 Train: 0.9214, Val: 0.8020, Test: 0.8220, Final Test: 0.8160\n",
"Epoch: 0069, Loss: 1.5290 Train: 0.9214, Val: 0.8020, Test: 0.8210, Final Test: 0.8160\n",
"Epoch: 0070, Loss: 1.4782 Train: 0.9214, Val: 0.8040, Test: 0.8220, Final Test: 0.8160\n",
"Epoch: 0071, Loss: 1.4475 Train: 0.9214, Val: 0.8040, Test: 0.8240, Final Test: 0.8160\n",
"Epoch: 0072, Loss: 1.5063 Train: 0.9214, Val: 0.8040, Test: 0.8280, Final Test: 0.8160\n",
"Epoch: 0073, Loss: 1.4627 Train: 0.9143, Val: 0.8000, Test: 0.8300, Final Test: 0.8160\n",
"Epoch: 0074, Loss: 1.4214 Train: 0.9214, Val: 0.7900, Test: 0.8230, Final Test: 0.8160\n",
"Epoch: 0075, Loss: 1.4920 Train: 0.9286, Val: 0.7860, Test: 0.8200, Final Test: 0.8160\n",
"Epoch: 0076, Loss: 1.4716 Train: 0.9214, Val: 0.7840, Test: 0.8110, Final Test: 0.8160\n",
"Epoch: 0077, Loss: 1.3362 Train: 0.9214, Val: 0.7820, Test: 0.8030, Final Test: 0.8160\n",
"Epoch: 0078, Loss: 1.4022 Train: 0.9214, Val: 0.7780, Test: 0.7940, Final Test: 0.8160\n",
"Epoch: 0079, Loss: 1.4305 Train: 0.9286, Val: 0.7760, Test: 0.7920, Final Test: 0.8160\n",
"Epoch: 0080, Loss: 1.3740 Train: 0.9286, Val: 0.7760, Test: 0.7910, Final Test: 0.8160\n",
"Epoch: 0081, Loss: 1.3107 Train: 0.9214, Val: 0.7740, Test: 0.7880, Final Test: 0.8160\n",
"Epoch: 0082, Loss: 1.3553 Train: 0.9214, Val: 0.7780, Test: 0.7940, Final Test: 0.8160\n",
"Epoch: 0083, Loss: 1.3393 Train: 0.9286, Val: 0.7800, Test: 0.8050, Final Test: 0.8160\n",
"Epoch: 0084, Loss: 1.4097 Train: 0.9286, Val: 0.7860, Test: 0.8190, Final Test: 0.8160\n",
"Epoch: 0085, Loss: 1.3974 Train: 0.9357, Val: 0.7860, Test: 0.8260, Final Test: 0.8160\n",
"Epoch: 0086, Loss: 1.3414 Train: 0.9286, Val: 0.8000, Test: 0.8320, Final Test: 0.8160\n",
"Epoch: 0087, Loss: 1.3714 Train: 0.9357, Val: 0.7960, Test: 0.8310, Final Test: 0.8160\n",
"Epoch: 0088, Loss: 1.2895 Train: 0.9357, Val: 0.8020, Test: 0.8300, Final Test: 0.8160\n",
"Epoch: 0089, Loss: 1.2676 Train: 0.9429, Val: 0.8040, Test: 0.8300, Final Test: 0.8160\n",
"Epoch: 0090, Loss: 1.3662 Train: 0.9429, Val: 0.8000, Test: 0.8220, Final Test: 0.8160\n",
"Epoch: 0091, Loss: 1.3708 Train: 0.9357, Val: 0.7980, Test: 0.8250, Final Test: 0.8160\n",
"Epoch: 0092, Loss: 1.3161 Train: 0.9357, Val: 0.8000, Test: 0.8300, Final Test: 0.8160\n",
"Epoch: 0093, Loss: 1.3378 Train: 0.9286, Val: 0.8040, Test: 0.8310, Final Test: 0.8160\n",
"Epoch: 0094, Loss: 1.2848 Train: 0.9214, Val: 0.8120, Test: 0.8320, Final Test: 0.8160\n",
"Epoch: 0095, Loss: 1.2813 Train: 0.9286, Val: 0.8100, Test: 0.8330, Final Test: 0.8160\n",
"Epoch: 0096, Loss: 1.3304 Train: 0.9286, Val: 0.8080, Test: 0.8340, Final Test: 0.8160\n",
"Epoch: 0097, Loss: 1.2391 Train: 0.9357, Val: 0.8040, Test: 0.8380, Final Test: 0.8160\n",
"Epoch: 0098, Loss: 1.3002 Train: 0.9357, Val: 0.8080, Test: 0.8410, Final Test: 0.8160\n",
"Epoch: 0099, Loss: 1.2699 Train: 0.9357, Val: 0.8120, Test: 0.8400, Final Test: 0.8160\n",
"Epoch: 0100, Loss: 1.2380 Train: 0.9357, Val: 0.8120, Test: 0.8410, Final Test: 0.8160\n",
"Epoch: 0101, Loss: 1.2608 Train: 0.9357, Val: 0.8120, Test: 0.8390, Final Test: 0.8160\n",
"Epoch: 0102, Loss: 1.3270 Train: 0.9357, Val: 0.8120, Test: 0.8350, Final Test: 0.8160\n",
"Epoch: 0103, Loss: 1.2458 Train: 0.9357, Val: 0.8060, Test: 0.8320, Final Test: 0.8160\n",
"Epoch: 0104, Loss: 1.2871 Train: 0.9429, Val: 0.8080, Test: 0.8320, Final Test: 0.8160\n",
"Epoch: 0105, Loss: 1.2503 Train: 0.9429, Val: 0.8060, Test: 0.8270, Final Test: 0.8160\n",
"Epoch: 0106, Loss: 1.2537 Train: 0.9357, Val: 0.8060, Test: 0.8240, Final Test: 0.8160\n",
"Epoch: 0107, Loss: 1.1699 Train: 0.9357, Val: 0.8060, Test: 0.8200, Final Test: 0.8160\n",
"Epoch: 0108, Loss: 1.2063 Train: 0.9357, Val: 0.8040, Test: 0.8170, Final Test: 0.8160\n",
"Epoch: 0109, Loss: 1.2961 Train: 0.9357, Val: 0.8040, Test: 0.8200, Final Test: 0.8160\n",
"Epoch: 0110, Loss: 1.2194 Train: 0.9357, Val: 0.8060, Test: 0.8250, Final Test: 0.8160\n",
"Epoch: 0111, Loss: 1.2011 Train: 0.9357, Val: 0.8040, Test: 0.8260, Final Test: 0.8160\n",
"Epoch: 0112, Loss: 1.2678 Train: 0.9429, Val: 0.8120, Test: 0.8350, Final Test: 0.8160\n",
"Epoch: 0113, Loss: 1.2066 Train: 0.9357, Val: 0.8100, Test: 0.8330, Final Test: 0.8160\n",
"Epoch: 0114, Loss: 1.1874 Train: 0.9357, Val: 0.8120, Test: 0.8350, Final Test: 0.8160\n",
"Epoch: 0115, Loss: 1.2112 Train: 0.9357, Val: 0.8080, Test: 0.8350, Final Test: 0.8160\n",
"Epoch: 0116, Loss: 1.2332 Train: 0.9286, Val: 0.8060, Test: 0.8340, Final Test: 0.8160\n",
"Epoch: 0117, Loss: 1.1332 Train: 0.9286, Val: 0.8080, Test: 0.8330, Final Test: 0.8160\n",
"Epoch: 0118, Loss: 1.1985 Train: 0.9357, Val: 0.8120, Test: 0.8300, Final Test: 0.8160\n",
"Epoch: 0119, Loss: 1.1034 Train: 0.9286, Val: 0.8100, Test: 0.8280, Final Test: 0.8160\n",
"Epoch: 0120, Loss: 1.2234 Train: 0.9286, Val: 0.8040, Test: 0.8300, Final Test: 0.8160\n",
"Epoch: 0121, Loss: 1.1704 Train: 0.9286, Val: 0.8080, Test: 0.8340, Final Test: 0.8160\n",
"Epoch: 0122, Loss: 1.3507 Train: 0.9286, Val: 0.8120, Test: 0.8390, Final Test: 0.8160\n",
"Epoch: 0123, Loss: 1.1011 Train: 0.9286, Val: 0.8120, Test: 0.8400, Final Test: 0.8160\n",
"Epoch: 0124, Loss: 1.1917 Train: 0.9286, Val: 0.8140, Test: 0.8370, Final Test: 0.8370\n",
"Epoch: 0125, Loss: 1.2025 Train: 0.9286, Val: 0.8120, Test: 0.8360, Final Test: 0.8370\n",
"Epoch: 0126, Loss: 1.0910 Train: 0.9286, Val: 0.8060, Test: 0.8350, Final Test: 0.8370\n",
"Epoch: 0127, Loss: 1.0832 Train: 0.9429, Val: 0.8040, Test: 0.8370, Final Test: 0.8370\n",
"Epoch: 0128, Loss: 1.1303 Train: 0.9429, Val: 0.8100, Test: 0.8350, Final Test: 0.8370\n",
"Epoch: 0129, Loss: 1.2324 Train: 0.9429, Val: 0.8080, Test: 0.8340, Final Test: 0.8370\n",
"Epoch: 0130, Loss: 1.0995 Train: 0.9500, Val: 0.8060, Test: 0.8360, Final Test: 0.8370\n",
"Epoch: 0131, Loss: 1.1813 Train: 0.9571, Val: 0.8040, Test: 0.8340, Final Test: 0.8370\n",
"Epoch: 0132, Loss: 1.1694 Train: 0.9500, Val: 0.8060, Test: 0.8420, Final Test: 0.8370\n",
"Epoch: 0133, Loss: 1.1377 Train: 0.9500, Val: 0.8080, Test: 0.8430, Final Test: 0.8370\n",
"Epoch: 0134, Loss: 1.1486 Train: 0.9500, Val: 0.8120, Test: 0.8470, Final Test: 0.8370\n",
"Epoch: 0135, Loss: 1.1842 Train: 0.9429, Val: 0.8080, Test: 0.8470, Final Test: 0.8370\n",
"Epoch: 0136, Loss: 1.1524 Train: 0.9500, Val: 0.8120, Test: 0.8430, Final Test: 0.8370\n",
"Epoch: 0137, Loss: 1.1453 Train: 0.9500, Val: 0.8100, Test: 0.8380, Final Test: 0.8370\n",
"Epoch: 0138, Loss: 1.2056 Train: 0.9643, Val: 0.8040, Test: 0.8370, Final Test: 0.8370\n",
"Epoch: 0139, Loss: 1.1291 Train: 0.9571, Val: 0.8080, Test: 0.8320, Final Test: 0.8370\n",
"Epoch: 0140, Loss: 1.1361 Train: 0.9429, Val: 0.8080, Test: 0.8280, Final Test: 0.8370\n",
"Epoch: 0141, Loss: 1.1212 Train: 0.9429, Val: 0.7980, Test: 0.8270, Final Test: 0.8370\n",
"Epoch: 0142, Loss: 1.1521 Train: 0.9357, Val: 0.7980, Test: 0.8220, Final Test: 0.8370\n",
"Epoch: 0143, Loss: 1.0984 Train: 0.9429, Val: 0.7960, Test: 0.8220, Final Test: 0.8370\n",
"Epoch: 0144, Loss: 1.0836 Train: 0.9429, Val: 0.7960, Test: 0.8210, Final Test: 0.8370\n",
"Epoch: 0145, Loss: 1.0955 Train: 0.9429, Val: 0.7980, Test: 0.8200, Final Test: 0.8370\n",
"Epoch: 0146, Loss: 1.0361 Train: 0.9429, Val: 0.8060, Test: 0.8240, Final Test: 0.8370\n",
"Epoch: 0147, Loss: 1.1376 Train: 0.9429, Val: 0.8080, Test: 0.8320, Final Test: 0.8370\n",
"Epoch: 0148, Loss: 1.0928 Train: 0.9357, Val: 0.8080, Test: 0.8400, Final Test: 0.8370\n",
"Epoch: 0149, Loss: 1.1654 Train: 0.9357, Val: 0.8060, Test: 0.8390, Final Test: 0.8370\n",
"Epoch: 0150, Loss: 1.1203 Train: 0.9500, Val: 0.8040, Test: 0.8410, Final Test: 0.8370\n",
"Epoch: 0151, Loss: 1.1612 Train: 0.9500, Val: 0.8000, Test: 0.8390, Final Test: 0.8370\n",
"Epoch: 0152, Loss: 1.1225 Train: 0.9500, Val: 0.7980, Test: 0.8370, Final Test: 0.8370\n",
"Epoch: 0153, Loss: 1.0554 Train: 0.9500, Val: 0.8040, Test: 0.8330, Final Test: 0.8370\n",
"Epoch: 0154, Loss: 1.1232 Train: 0.9500, Val: 0.8020, Test: 0.8310, Final Test: 0.8370\n",
"Epoch: 0155, Loss: 1.0364 Train: 0.9429, Val: 0.8060, Test: 0.8310, Final Test: 0.8370\n",
"Epoch: 0156, Loss: 1.0133 Train: 0.9429, Val: 0.8080, Test: 0.8330, Final Test: 0.8370\n",
"Epoch: 0157, Loss: 1.1166 Train: 0.9500, Val: 0.8040, Test: 0.8360, Final Test: 0.8370\n",
"Epoch: 0158, Loss: 1.0719 Train: 0.9571, Val: 0.8060, Test: 0.8410, Final Test: 0.8370\n",
"Epoch: 0159, Loss: 1.0971 Train: 0.9571, Val: 0.8060, Test: 0.8460, Final Test: 0.8370\n",
"Epoch: 0160, Loss: 0.9990 Train: 0.9571, Val: 0.8000, Test: 0.8430, Final Test: 0.8370\n",
"Epoch: 0161, Loss: 1.0392 Train: 0.9429, Val: 0.8140, Test: 0.8410, Final Test: 0.8370\n",
"Epoch: 0162, Loss: 1.0615 Train: 0.9429, Val: 0.8180, Test: 0.8350, Final Test: 0.8350\n",
"Epoch: 0163, Loss: 1.1055 Train: 0.9429, Val: 0.8160, Test: 0.8320, Final Test: 0.8350\n",
"Epoch: 0164, Loss: 1.0703 Train: 0.9357, Val: 0.8140, Test: 0.8250, Final Test: 0.8350\n",
"Epoch: 0165, Loss: 1.0657 Train: 0.9357, Val: 0.8100, Test: 0.8260, Final Test: 0.8350\n",
"Epoch: 0166, Loss: 1.0047 Train: 0.9429, Val: 0.8120, Test: 0.8250, Final Test: 0.8350\n",
"Epoch: 0167, Loss: 1.1298 Train: 0.9429, Val: 0.8140, Test: 0.8250, Final Test: 0.8350\n",
"Epoch: 0168, Loss: 1.1080 Train: 0.9500, Val: 0.8120, Test: 0.8250, Final Test: 0.8350\n",
"Epoch: 0169, Loss: 1.0947 Train: 0.9571, Val: 0.8160, Test: 0.8270, Final Test: 0.8350\n",
"Epoch: 0170, Loss: 0.9786 Train: 0.9500, Val: 0.8160, Test: 0.8300, Final Test: 0.8350\n",
"Epoch: 0171, Loss: 1.0583 Train: 0.9500, Val: 0.8180, Test: 0.8370, Final Test: 0.8350\n",
"Epoch: 0172, Loss: 1.0910 Train: 0.9643, Val: 0.8160, Test: 0.8400, Final Test: 0.8350\n",
"Epoch: 0173, Loss: 1.0850 Train: 0.9643, Val: 0.8120, Test: 0.8440, Final Test: 0.8350\n",
"Epoch: 0174, Loss: 1.0299 Train: 0.9571, Val: 0.8120, Test: 0.8450, Final Test: 0.8350\n",
"Epoch: 0175, Loss: 0.9622 Train: 0.9571, Val: 0.8180, Test: 0.8470, Final Test: 0.8350\n",
"Epoch: 0176, Loss: 1.0259 Train: 0.9643, Val: 0.8180, Test: 0.8480, Final Test: 0.8350\n",
"Epoch: 0177, Loss: 1.1419 Train: 0.9643, Val: 0.8180, Test: 0.8460, Final Test: 0.8350\n",
"Epoch: 0178, Loss: 1.0607 Train: 0.9571, Val: 0.8120, Test: 0.8480, Final Test: 0.8350\n",
"Epoch: 0179, Loss: 0.9937 Train: 0.9571, Val: 0.8080, Test: 0.8410, Final Test: 0.8350\n",
"Epoch: 0180, Loss: 1.0018 Train: 0.9500, Val: 0.8020, Test: 0.8440, Final Test: 0.8350\n",
"Epoch: 0181, Loss: 0.9408 Train: 0.9500, Val: 0.8080, Test: 0.8400, Final Test: 0.8350\n",
"Epoch: 0182, Loss: 1.0836 Train: 0.9429, Val: 0.8060, Test: 0.8360, Final Test: 0.8350\n",
"Epoch: 0183, Loss: 0.9932 Train: 0.9429, Val: 0.8080, Test: 0.8340, Final Test: 0.8350\n",
"Epoch: 0184, Loss: 1.0171 Train: 0.9429, Val: 0.8120, Test: 0.8350, Final Test: 0.8350\n",
"Epoch: 0185, Loss: 1.0137 Train: 0.9429, Val: 0.8120, Test: 0.8310, Final Test: 0.8350\n",
"Epoch: 0186, Loss: 1.0509 Train: 0.9429, Val: 0.8100, Test: 0.8260, Final Test: 0.8350\n",
"Epoch: 0187, Loss: 1.0186 Train: 0.9429, Val: 0.8080, Test: 0.8290, Final Test: 0.8350\n",
"Epoch: 0188, Loss: 1.0322 Train: 0.9357, Val: 0.8080, Test: 0.8270, Final Test: 0.8350\n",
"Epoch: 0189, Loss: 1.0844 Train: 0.9429, Val: 0.8120, Test: 0.8270, Final Test: 0.8350\n",
"Epoch: 0190, Loss: 0.9555 Train: 0.9429, Val: 0.8140, Test: 0.8240, Final Test: 0.8350\n",
"Epoch: 0191, Loss: 1.0007 Train: 0.9429, Val: 0.8140, Test: 0.8300, Final Test: 0.8350\n",
"Epoch: 0192, Loss: 1.0208 Train: 0.9429, Val: 0.8120, Test: 0.8310, Final Test: 0.8350\n",
"Epoch: 0193, Loss: 1.0260 Train: 0.9429, Val: 0.8080, Test: 0.8320, Final Test: 0.8350\n",
"Epoch: 0194, Loss: 1.0353 Train: 0.9500, Val: 0.8100, Test: 0.8340, Final Test: 0.8350\n",
"Epoch: 0195, Loss: 0.9907 Train: 0.9571, Val: 0.8080, Test: 0.8340, Final Test: 0.8350\n",
"Epoch: 0196, Loss: 1.0123 Train: 0.9571, Val: 0.8060, Test: 0.8360, Final Test: 0.8350\n",
"Epoch: 0197, Loss: 0.9784 Train: 0.9571, Val: 0.8060, Test: 0.8410, Final Test: 0.8350\n",
"Epoch: 0198, Loss: 1.0541 Train: 0.9571, Val: 0.8040, Test: 0.8420, Final Test: 0.8350\n",
"Epoch: 0199, Loss: 0.9980 Train: 0.9571, Val: 0.8120, Test: 0.8450, Final Test: 0.8350\n",
"Epoch: 0200, Loss: 1.0216 Train: 0.9500, Val: 0.8120, Test: 0.8450, Final Test: 0.8350\n",
"Epoch: 0201, Loss: 1.0475 Train: 0.9500, Val: 0.8120, Test: 0.8480, Final Test: 0.8350\n",
"Epoch: 0202, Loss: 1.0101 Train: 0.9500, Val: 0.8160, Test: 0.8440, Final Test: 0.8350\n",
"Epoch: 0203, Loss: 1.2339 Train: 0.9429, Val: 0.8200, Test: 0.8330, Final Test: 0.8330\n",
"Epoch: 0204, Loss: 0.9972 Train: 0.9429, Val: 0.8180, Test: 0.8340, Final Test: 0.8330\n",
"Epoch: 0205, Loss: 1.0353 Train: 0.9571, Val: 0.8140, Test: 0.8310, Final Test: 0.8330\n",
"Epoch: 0206, Loss: 1.0271 Train: 0.9500, Val: 0.8100, Test: 0.8290, Final Test: 0.8330\n",
"Epoch: 0207, Loss: 0.9851 Train: 0.9500, Val: 0.8100, Test: 0.8320, Final Test: 0.8330\n",
"Epoch: 0208, Loss: 0.9429 Train: 0.9500, Val: 0.8140, Test: 0.8350, Final Test: 0.8330\n",
"Epoch: 0209, Loss: 0.9501 Train: 0.9500, Val: 0.8180, Test: 0.8370, Final Test: 0.8330\n",
"Epoch: 0210, Loss: 1.1267 Train: 0.9500, Val: 0.8180, Test: 0.8420, Final Test: 0.8330\n",
"Epoch: 0211, Loss: 1.0375 Train: 0.9500, Val: 0.8180, Test: 0.8480, Final Test: 0.8330\n",
"Epoch: 0212, Loss: 0.9986 Train: 0.9500, Val: 0.8220, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0213, Loss: 1.1109 Train: 0.9571, Val: 0.8240, Test: 0.8440, Final Test: 0.8440\n",
"Epoch: 0214, Loss: 1.1928 Train: 0.9571, Val: 0.8180, Test: 0.8470, Final Test: 0.8440\n",
"Epoch: 0215, Loss: 1.0493 Train: 0.9571, Val: 0.8160, Test: 0.8470, Final Test: 0.8440\n",
"Epoch: 0216, Loss: 0.9264 Train: 0.9429, Val: 0.8060, Test: 0.8430, Final Test: 0.8440\n",
"Epoch: 0217, Loss: 0.9971 Train: 0.9357, Val: 0.7960, Test: 0.8400, Final Test: 0.8440\n",
"Epoch: 0218, Loss: 0.9280 Train: 0.9429, Val: 0.7980, Test: 0.8400, Final Test: 0.8440\n",
"Epoch: 0219, Loss: 1.0218 Train: 0.9429, Val: 0.8020, Test: 0.8340, Final Test: 0.8440\n",
"Epoch: 0220, Loss: 0.9961 Train: 0.9429, Val: 0.8060, Test: 0.8390, Final Test: 0.8440\n",
"Epoch: 0221, Loss: 0.8922 Train: 0.9500, Val: 0.8040, Test: 0.8380, Final Test: 0.8440\n",
"Epoch: 0222, Loss: 1.3371 Train: 0.9571, Val: 0.8080, Test: 0.8350, Final Test: 0.8440\n",
"Epoch: 0223, Loss: 0.9842 Train: 0.9571, Val: 0.8100, Test: 0.8370, Final Test: 0.8440\n",
"Epoch: 0224, Loss: 0.9730 Train: 0.9571, Val: 0.8140, Test: 0.8380, Final Test: 0.8440\n",
"Epoch: 0225, Loss: 1.0050 Train: 0.9571, Val: 0.8200, Test: 0.8350, Final Test: 0.8440\n",
"Epoch: 0226, Loss: 0.9376 Train: 0.9571, Val: 0.8220, Test: 0.8380, Final Test: 0.8440\n",
"Epoch: 0227, Loss: 0.9561 Train: 0.9714, Val: 0.8140, Test: 0.8410, Final Test: 0.8440\n",
"Epoch: 0228, Loss: 1.0561 Train: 0.9714, Val: 0.8120, Test: 0.8450, Final Test: 0.8440\n",
"Epoch: 0229, Loss: 0.9139 Train: 0.9714, Val: 0.8160, Test: 0.8530, Final Test: 0.8440\n",
"Epoch: 0230, Loss: 1.0523 Train: 0.9714, Val: 0.8140, Test: 0.8570, Final Test: 0.8440\n",
"Epoch: 0231, Loss: 1.1802 Train: 0.9714, Val: 0.8160, Test: 0.8570, Final Test: 0.8440\n",
"Epoch: 0232, Loss: 1.0045 Train: 0.9714, Val: 0.8160, Test: 0.8560, Final Test: 0.8440\n",
"Epoch: 0233, Loss: 0.9725 Train: 0.9714, Val: 0.8160, Test: 0.8600, Final Test: 0.8440\n",
"Epoch: 0234, Loss: 0.9334 Train: 0.9714, Val: 0.8200, Test: 0.8570, Final Test: 0.8440\n",
"Epoch: 0235, Loss: 0.9252 Train: 0.9714, Val: 0.8200, Test: 0.8550, Final Test: 0.8440\n",
"Epoch: 0236, Loss: 0.9016 Train: 0.9714, Val: 0.8240, Test: 0.8510, Final Test: 0.8440\n",
"Epoch: 0237, Loss: 0.9425 Train: 0.9714, Val: 0.8300, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0238, Loss: 1.0282 Train: 0.9714, Val: 0.8300, Test: 0.8420, Final Test: 0.8430\n",
"Epoch: 0239, Loss: 0.9524 Train: 0.9643, Val: 0.8140, Test: 0.8370, Final Test: 0.8430\n",
"Epoch: 0240, Loss: 0.8939 Train: 0.9571, Val: 0.8160, Test: 0.8370, Final Test: 0.8430\n",
"Epoch: 0241, Loss: 0.9873 Train: 0.9571, Val: 0.8180, Test: 0.8340, Final Test: 0.8430\n",
"Epoch: 0242, Loss: 0.9083 Train: 0.9571, Val: 0.8200, Test: 0.8340, Final Test: 0.8430\n",
"Epoch: 0243, Loss: 0.9001 Train: 0.9571, Val: 0.8220, Test: 0.8370, Final Test: 0.8430\n",
"Epoch: 0244, Loss: 0.9436 Train: 0.9643, Val: 0.8140, Test: 0.8370, Final Test: 0.8430\n",
"Epoch: 0245, Loss: 0.8496 Train: 0.9714, Val: 0.8100, Test: 0.8410, Final Test: 0.8430\n",
"Epoch: 0246, Loss: 1.4432 Train: 0.9714, Val: 0.8140, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0247, Loss: 1.0360 Train: 0.9643, Val: 0.8100, Test: 0.8470, Final Test: 0.8430\n",
"Epoch: 0248, Loss: 0.8782 Train: 0.9714, Val: 0.8100, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0249, Loss: 1.0033 Train: 0.9714, Val: 0.8120, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0250, Loss: 1.0256 Train: 0.9714, Val: 0.8160, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0251, Loss: 0.9763 Train: 0.9643, Val: 0.8180, Test: 0.8390, Final Test: 0.8430\n",
"Epoch: 0252, Loss: 0.9148 Train: 0.9571, Val: 0.8160, Test: 0.8400, Final Test: 0.8430\n",
"Epoch: 0253, Loss: 1.0270 Train: 0.9500, Val: 0.8160, Test: 0.8370, Final Test: 0.8430\n",
"Epoch: 0254, Loss: 0.9703 Train: 0.9500, Val: 0.8120, Test: 0.8340, Final Test: 0.8430\n",
"Epoch: 0255, Loss: 0.9336 Train: 0.9500, Val: 0.8040, Test: 0.8320, Final Test: 0.8430\n",
"Epoch: 0256, Loss: 0.8379 Train: 0.9500, Val: 0.8040, Test: 0.8330, Final Test: 0.8430\n",
"Epoch: 0257, Loss: 0.9327 Train: 0.9500, Val: 0.8040, Test: 0.8320, Final Test: 0.8430\n",
"Epoch: 0258, Loss: 0.9649 Train: 0.9500, Val: 0.8020, Test: 0.8310, Final Test: 0.8430\n",
"Epoch: 0259, Loss: 0.8717 Train: 0.9500, Val: 0.8040, Test: 0.8320, Final Test: 0.8430\n",
"Epoch: 0260, Loss: 0.8763 Train: 0.9571, Val: 0.8060, Test: 0.8360, Final Test: 0.8430\n",
"Epoch: 0261, Loss: 0.8919 Train: 0.9643, Val: 0.8060, Test: 0.8410, Final Test: 0.8430\n",
"Epoch: 0262, Loss: 0.8860 Train: 0.9643, Val: 0.8080, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0263, Loss: 0.9011 Train: 0.9643, Val: 0.8140, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0264, Loss: 0.9433 Train: 0.9643, Val: 0.8180, Test: 0.8470, Final Test: 0.8430\n",
"Epoch: 0265, Loss: 0.9170 Train: 0.9714, Val: 0.8200, Test: 0.8490, Final Test: 0.8430\n",
"Epoch: 0266, Loss: 0.9294 Train: 0.9714, Val: 0.8200, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0267, Loss: 0.9229 Train: 0.9714, Val: 0.8200, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0268, Loss: 0.8923 Train: 0.9714, Val: 0.8200, Test: 0.8420, Final Test: 0.8430\n",
"Epoch: 0269, Loss: 0.8947 Train: 0.9714, Val: 0.8200, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0270, Loss: 0.8803 Train: 0.9714, Val: 0.8220, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0271, Loss: 0.9180 Train: 0.9714, Val: 0.8200, Test: 0.8470, Final Test: 0.8430\n",
"Epoch: 0272, Loss: 0.8690 Train: 0.9714, Val: 0.8180, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0273, Loss: 0.9811 Train: 0.9714, Val: 0.8160, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0274, Loss: 0.9225 Train: 0.9714, Val: 0.8140, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0275, Loss: 0.9615 Train: 0.9714, Val: 0.8160, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0276, Loss: 0.9357 Train: 0.9786, Val: 0.8180, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0277, Loss: 1.0285 Train: 0.9786, Val: 0.8180, Test: 0.8470, Final Test: 0.8430\n",
"Epoch: 0278, Loss: 1.2738 Train: 0.9786, Val: 0.8140, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0279, Loss: 0.9091 Train: 0.9786, Val: 0.8120, Test: 0.8500, Final Test: 0.8430\n",
"Epoch: 0280, Loss: 0.9446 Train: 0.9786, Val: 0.8160, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0281, Loss: 0.8910 Train: 0.9643, Val: 0.8160, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0282, Loss: 0.9930 Train: 0.9643, Val: 0.8120, Test: 0.8470, Final Test: 0.8430\n",
"Epoch: 0283, Loss: 1.0465 Train: 0.9571, Val: 0.8100, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0284, Loss: 0.8807 Train: 0.9571, Val: 0.8080, Test: 0.8410, Final Test: 0.8430\n",
"Epoch: 0285, Loss: 0.9013 Train: 0.9643, Val: 0.8120, Test: 0.8410, Final Test: 0.8430\n",
"Epoch: 0286, Loss: 0.9517 Train: 0.9643, Val: 0.8160, Test: 0.8370, Final Test: 0.8430\n",
"Epoch: 0287, Loss: 0.9282 Train: 0.9643, Val: 0.8200, Test: 0.8410, Final Test: 0.8430\n",
"Epoch: 0288, Loss: 0.8842 Train: 0.9714, Val: 0.8240, Test: 0.8430, Final Test: 0.8430\n",
"Epoch: 0289, Loss: 0.9477 Train: 0.9714, Val: 0.8220, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0290, Loss: 0.9267 Train: 0.9714, Val: 0.8200, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0291, Loss: 0.9288 Train: 0.9786, Val: 0.8240, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0292, Loss: 0.8249 Train: 0.9786, Val: 0.8200, Test: 0.8490, Final Test: 0.8430\n",
"Epoch: 0293, Loss: 0.9213 Train: 0.9786, Val: 0.8160, Test: 0.8500, Final Test: 0.8430\n",
"Epoch: 0294, Loss: 0.9294 Train: 0.9786, Val: 0.8120, Test: 0.8570, Final Test: 0.8430\n",
"Epoch: 0295, Loss: 0.8483 Train: 0.9786, Val: 0.8180, Test: 0.8560, Final Test: 0.8430\n",
"Epoch: 0296, Loss: 0.9735 Train: 0.9786, Val: 0.8220, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0297, Loss: 0.9152 Train: 0.9857, Val: 0.8280, Test: 0.8490, Final Test: 0.8430\n",
"Epoch: 0298, Loss: 0.8807 Train: 0.9857, Val: 0.8240, Test: 0.8490, Final Test: 0.8430\n",
"Epoch: 0299, Loss: 1.2930 Train: 0.9857, Val: 0.8280, Test: 0.8550, Final Test: 0.8430\n",
"Epoch: 0300, Loss: 0.9727 Train: 0.9714, Val: 0.8300, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0301, Loss: 0.9381 Train: 0.9714, Val: 0.8280, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0302, Loss: 0.9046 Train: 0.9714, Val: 0.8260, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0303, Loss: 0.9381 Train: 0.9643, Val: 0.8260, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0304, Loss: 0.9733 Train: 0.9643, Val: 0.8260, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0305, Loss: 0.9470 Train: 0.9643, Val: 0.8280, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0306, Loss: 0.9045 Train: 0.9643, Val: 0.8280, Test: 0.8420, Final Test: 0.8430\n",
"Epoch: 0307, Loss: 0.8821 Train: 0.9571, Val: 0.8260, Test: 0.8390, Final Test: 0.8430\n",
"Epoch: 0308, Loss: 0.8136 Train: 0.9571, Val: 0.8240, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0309, Loss: 0.8884 Train: 0.9571, Val: 0.8240, Test: 0.8420, Final Test: 0.8430\n",
"Epoch: 0310, Loss: 0.9682 Train: 0.9571, Val: 0.8260, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0311, Loss: 0.8817 Train: 0.9571, Val: 0.8260, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0312, Loss: 0.8477 Train: 0.9643, Val: 0.8260, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0313, Loss: 0.9541 Train: 0.9714, Val: 0.8300, Test: 0.8500, Final Test: 0.8430\n",
"Epoch: 0314, Loss: 0.8849 Train: 0.9714, Val: 0.8260, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0315, Loss: 0.9296 Train: 0.9714, Val: 0.8200, Test: 0.8540, Final Test: 0.8430\n",
"Epoch: 0316, Loss: 0.9546 Train: 0.9786, Val: 0.8180, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0317, Loss: 0.9825 Train: 0.9786, Val: 0.8160, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0318, Loss: 0.8726 Train: 0.9786, Val: 0.8200, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0319, Loss: 0.9199 Train: 0.9714, Val: 0.8240, Test: 0.8490, Final Test: 0.8430\n",
"Epoch: 0320, Loss: 0.8825 Train: 0.9714, Val: 0.8220, Test: 0.8500, Final Test: 0.8430\n",
"Epoch: 0321, Loss: 0.9425 Train: 0.9714, Val: 0.8260, Test: 0.8500, Final Test: 0.8430\n",
"Epoch: 0322, Loss: 0.9369 Train: 0.9714, Val: 0.8200, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0323, Loss: 0.8965 Train: 0.9714, Val: 0.8220, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0324, Loss: 0.9405 Train: 0.9714, Val: 0.8200, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0325, Loss: 0.9288 Train: 0.9714, Val: 0.8240, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0326, Loss: 1.0131 Train: 0.9714, Val: 0.8220, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0327, Loss: 0.8734 Train: 0.9714, Val: 0.8220, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0328, Loss: 0.8546 Train: 0.9714, Val: 0.8220, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0329, Loss: 0.8425 Train: 0.9714, Val: 0.8220, Test: 0.8560, Final Test: 0.8430\n",
"Epoch: 0330, Loss: 0.9020 Train: 0.9714, Val: 0.8180, Test: 0.8550, Final Test: 0.8430\n",
"Epoch: 0331, Loss: 0.8081 Train: 0.9714, Val: 0.8240, Test: 0.8560, Final Test: 0.8430\n",
"Epoch: 0332, Loss: 0.9260 Train: 0.9714, Val: 0.8240, Test: 0.8580, Final Test: 0.8430\n",
"Epoch: 0333, Loss: 1.0115 Train: 0.9786, Val: 0.8180, Test: 0.8560, Final Test: 0.8430\n",
"Epoch: 0334, Loss: 1.0361 Train: 0.9786, Val: 0.8120, Test: 0.8550, Final Test: 0.8430\n",
"Epoch: 0335, Loss: 1.0250 Train: 0.9786, Val: 0.8140, Test: 0.8560, Final Test: 0.8430\n",
"Epoch: 0336, Loss: 0.9239 Train: 0.9786, Val: 0.8120, Test: 0.8540, Final Test: 0.8430\n",
"Epoch: 0337, Loss: 0.8864 Train: 0.9857, Val: 0.8100, Test: 0.8530, Final Test: 0.8430\n",
"Epoch: 0338, Loss: 0.9487 Train: 0.9857, Val: 0.8140, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0339, Loss: 1.0333 Train: 0.9857, Val: 0.8140, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0340, Loss: 0.8890 Train: 0.9786, Val: 0.8200, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0341, Loss: 0.8360 Train: 0.9786, Val: 0.8220, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0342, Loss: 0.9248 Train: 0.9786, Val: 0.8240, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0343, Loss: 0.8522 Train: 0.9786, Val: 0.8240, Test: 0.8410, Final Test: 0.8430\n",
"Epoch: 0344, Loss: 0.7925 Train: 0.9786, Val: 0.8220, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0345, Loss: 0.8802 Train: 0.9714, Val: 0.8220, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0346, Loss: 0.8593 Train: 0.9714, Val: 0.8220, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0347, Loss: 0.9431 Train: 0.9714, Val: 0.8200, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0348, Loss: 0.8651 Train: 0.9643, Val: 0.8200, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0349, Loss: 0.9320 Train: 0.9643, Val: 0.8260, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0350, Loss: 0.9708 Train: 0.9643, Val: 0.8300, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0351, Loss: 0.8071 Train: 0.9643, Val: 0.8280, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0352, Loss: 0.9295 Train: 0.9571, Val: 0.8260, Test: 0.8450, Final Test: 0.8430\n",
"Epoch: 0353, Loss: 0.8640 Train: 0.9643, Val: 0.8240, Test: 0.8440, Final Test: 0.8430\n",
"Epoch: 0354, Loss: 0.8147 Train: 0.9643, Val: 0.8220, Test: 0.8470, Final Test: 0.8430\n",
"Epoch: 0355, Loss: 0.9458 Train: 0.9714, Val: 0.8280, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0356, Loss: 0.8760 Train: 0.9714, Val: 0.8280, Test: 0.8490, Final Test: 0.8430\n",
"Epoch: 0357, Loss: 0.8947 Train: 0.9714, Val: 0.8280, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0358, Loss: 0.8302 Train: 0.9714, Val: 0.8240, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0359, Loss: 0.8724 Train: 0.9643, Val: 0.8160, Test: 0.8520, Final Test: 0.8430\n",
"Epoch: 0360, Loss: 0.9597 Train: 0.9714, Val: 0.8200, Test: 0.8510, Final Test: 0.8430\n",
"Epoch: 0361, Loss: 0.8439 Train: 0.9714, Val: 0.8180, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0362, Loss: 0.8774 Train: 0.9714, Val: 0.8120, Test: 0.8460, Final Test: 0.8430\n",
"Epoch: 0363, Loss: 0.9724 Train: 0.9714, Val: 0.8180, Test: 0.8480, Final Test: 0.8430\n",
"Epoch: 0364, Loss: 0.9150 Train: 0.9714, Val: 0.8180, Test: 0.8540, Final Test: 0.8430\n",
"Epoch: 0365, Loss: 0.8400 Train: 0.9714, Val: 0.8240, Test: 0.8560, Final Test: 0.8430\n",
"Epoch: 0366, Loss: 0.8702 Train: 0.9714, Val: 0.8300, Test: 0.8580, Final Test: 0.8430\n",
"Epoch: 0367, Loss: 0.8602 Train: 0.9643, Val: 0.8320, Test: 0.8570, Final Test: 0.8570\n",
"Epoch: 0368, Loss: 0.7996 Train: 0.9643, Val: 0.8300, Test: 0.8560, Final Test: 0.8570\n",
"Epoch: 0369, Loss: 0.8429 Train: 0.9643, Val: 0.8300, Test: 0.8550, Final Test: 0.8570\n",
"Epoch: 0370, Loss: 0.9029 Train: 0.9643, Val: 0.8240, Test: 0.8550, Final Test: 0.8570\n",
"Epoch: 0371, Loss: 0.8675 Train: 0.9714, Val: 0.8220, Test: 0.8580, Final Test: 0.8570\n",
"Epoch: 0372, Loss: 0.8863 Train: 0.9786, Val: 0.8240, Test: 0.8580, Final Test: 0.8570\n",
"Epoch: 0373, Loss: 0.8376 Train: 0.9786, Val: 0.8260, Test: 0.8580, Final Test: 0.8570\n",
"Epoch: 0374, Loss: 0.8671 Train: 0.9786, Val: 0.8280, Test: 0.8570, Final Test: 0.8570\n",
"Epoch: 0375, Loss: 0.7907 Train: 0.9714, Val: 0.8260, Test: 0.8550, Final Test: 0.8570\n",
"Epoch: 0376, Loss: 0.8154 Train: 0.9714, Val: 0.8220, Test: 0.8550, Final Test: 0.8570\n",
"Epoch: 0377, Loss: 0.9570 Train: 0.9714, Val: 0.8220, Test: 0.8530, Final Test: 0.8570\n",
"Epoch: 0378, Loss: 0.8546 Train: 0.9714, Val: 0.8260, Test: 0.8530, Final Test: 0.8570\n",
"Epoch: 0379, Loss: 0.8899 Train: 0.9714, Val: 0.8200, Test: 0.8490, Final Test: 0.8570\n",
"Epoch: 0380, Loss: 0.8818 Train: 0.9714, Val: 0.8200, Test: 0.8500, Final Test: 0.8570\n",
"Epoch: 0381, Loss: 0.7514 Train: 0.9643, Val: 0.8200, Test: 0.8520, Final Test: 0.8570\n",
"Epoch: 0382, Loss: 0.9321 Train: 0.9571, Val: 0.8100, Test: 0.8510, Final Test: 0.8570\n",
"Epoch: 0383, Loss: 0.8478 Train: 0.9714, Val: 0.8140, Test: 0.8480, Final Test: 0.8570\n",
"Epoch: 0384, Loss: 0.8036 Train: 0.9714, Val: 0.8200, Test: 0.8500, Final Test: 0.8570\n",
"Epoch: 0385, Loss: 0.8431 Train: 0.9714, Val: 0.8300, Test: 0.8510, Final Test: 0.8570\n",
"Epoch: 0386, Loss: 0.8024 Train: 0.9714, Val: 0.8300, Test: 0.8490, Final Test: 0.8570\n",
"Epoch: 0387, Loss: 0.8896 Train: 0.9714, Val: 0.8380, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0388, Loss: 0.7929 Train: 0.9643, Val: 0.8360, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0389, Loss: 0.9242 Train: 0.9643, Val: 0.8380, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0390, Loss: 0.8745 Train: 0.9714, Val: 0.8320, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0391, Loss: 0.9870 Train: 0.9786, Val: 0.8280, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0392, Loss: 0.9419 Train: 0.9786, Val: 0.8260, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0393, Loss: 0.8447 Train: 0.9786, Val: 0.8200, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0394, Loss: 0.9472 Train: 0.9786, Val: 0.8240, Test: 0.8440, Final Test: 0.8460\n",
"Epoch: 0395, Loss: 0.8500 Train: 0.9714, Val: 0.8200, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0396, Loss: 0.8033 Train: 0.9714, Val: 0.8240, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0397, Loss: 0.8986 Train: 0.9714, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0398, Loss: 1.0511 Train: 0.9714, Val: 0.8240, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0399, Loss: 0.9164 Train: 0.9714, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0400, Loss: 0.9661 Train: 0.9714, Val: 0.8160, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0401, Loss: 0.8784 Train: 0.9714, Val: 0.8140, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0402, Loss: 0.8418 Train: 0.9714, Val: 0.8160, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0403, Loss: 0.8502 Train: 0.9714, Val: 0.8160, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0404, Loss: 0.7990 Train: 0.9714, Val: 0.8200, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0405, Loss: 0.8957 Train: 0.9714, Val: 0.8280, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0406, Loss: 0.9187 Train: 0.9714, Val: 0.8260, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0407, Loss: 0.8550 Train: 0.9714, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0408, Loss: 0.8590 Train: 0.9786, Val: 0.8280, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0409, Loss: 0.8596 Train: 0.9786, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0410, Loss: 0.7993 Train: 0.9714, Val: 0.8220, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0411, Loss: 0.9184 Train: 0.9714, Val: 0.8180, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0412, Loss: 0.8727 Train: 0.9643, Val: 0.8140, Test: 0.8450, Final Test: 0.8460\n",
"Epoch: 0413, Loss: 0.9266 Train: 0.9643, Val: 0.8160, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0414, Loss: 0.7438 Train: 0.9643, Val: 0.8160, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0415, Loss: 0.8132 Train: 0.9714, Val: 0.8160, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0416, Loss: 0.8372 Train: 0.9714, Val: 0.8120, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0417, Loss: 0.8459 Train: 0.9714, Val: 0.8100, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0418, Loss: 0.9017 Train: 0.9714, Val: 0.8120, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0419, Loss: 0.9180 Train: 0.9786, Val: 0.8140, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0420, Loss: 0.9135 Train: 0.9786, Val: 0.8220, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0421, Loss: 0.8469 Train: 0.9786, Val: 0.8200, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0422, Loss: 0.8333 Train: 0.9786, Val: 0.8200, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0423, Loss: 0.8655 Train: 0.9714, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0424, Loss: 0.9376 Train: 0.9643, Val: 0.8260, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0425, Loss: 0.9261 Train: 0.9643, Val: 0.8240, Test: 0.8440, Final Test: 0.8460\n",
"Epoch: 0426, Loss: 0.8571 Train: 0.9643, Val: 0.8200, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0427, Loss: 0.8902 Train: 0.9643, Val: 0.8220, Test: 0.8420, Final Test: 0.8460\n",
"Epoch: 0428, Loss: 0.8596 Train: 0.9643, Val: 0.8220, Test: 0.8380, Final Test: 0.8460\n",
"Epoch: 0429, Loss: 0.9028 Train: 0.9643, Val: 0.8280, Test: 0.8400, Final Test: 0.8460\n",
"Epoch: 0430, Loss: 0.8117 Train: 0.9714, Val: 0.8320, Test: 0.8430, Final Test: 0.8460\n",
"Epoch: 0431, Loss: 0.8595 Train: 0.9714, Val: 0.8300, Test: 0.8430, Final Test: 0.8460\n",
"Epoch: 0432, Loss: 0.8528 Train: 0.9714, Val: 0.8300, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0433, Loss: 0.9284 Train: 0.9714, Val: 0.8280, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0434, Loss: 0.9028 Train: 0.9714, Val: 0.8300, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0435, Loss: 0.7448 Train: 0.9714, Val: 0.8320, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0436, Loss: 0.7764 Train: 0.9786, Val: 0.8260, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0437, Loss: 0.7423 Train: 0.9714, Val: 0.8180, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0438, Loss: 0.8408 Train: 0.9643, Val: 0.8080, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0439, Loss: 0.8787 Train: 0.9714, Val: 0.8140, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0440, Loss: 0.8019 Train: 0.9714, Val: 0.8100, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0441, Loss: 0.9706 Train: 0.9714, Val: 0.8180, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0442, Loss: 0.8047 Train: 0.9714, Val: 0.8220, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0443, Loss: 0.7968 Train: 0.9714, Val: 0.8200, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0444, Loss: 0.7832 Train: 0.9786, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0445, Loss: 0.8591 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0446, Loss: 0.8288 Train: 0.9786, Val: 0.8300, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0447, Loss: 0.8797 Train: 0.9786, Val: 0.8240, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0448, Loss: 0.8992 Train: 0.9786, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0449, Loss: 0.8988 Train: 0.9786, Val: 0.8260, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0450, Loss: 0.8705 Train: 0.9786, Val: 0.8280, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0451, Loss: 0.9503 Train: 0.9714, Val: 0.8260, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0452, Loss: 0.8543 Train: 0.9786, Val: 0.8300, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0453, Loss: 0.8019 Train: 0.9786, Val: 0.8300, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0454, Loss: 0.8696 Train: 0.9786, Val: 0.8280, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0455, Loss: 0.8261 Train: 0.9714, Val: 0.8180, Test: 0.8440, Final Test: 0.8460\n",
"Epoch: 0456, Loss: 0.7876 Train: 0.9714, Val: 0.8200, Test: 0.8430, Final Test: 0.8460\n",
"Epoch: 0457, Loss: 0.7697 Train: 0.9714, Val: 0.8140, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0458, Loss: 0.8388 Train: 0.9643, Val: 0.8160, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0459, Loss: 0.8829 Train: 0.9643, Val: 0.8180, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0460, Loss: 0.7712 Train: 0.9643, Val: 0.8220, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0461, Loss: 0.8822 Train: 0.9643, Val: 0.8220, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0462, Loss: 0.8220 Train: 0.9714, Val: 0.8200, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0463, Loss: 0.8202 Train: 0.9643, Val: 0.8200, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0464, Loss: 0.9055 Train: 0.9643, Val: 0.8120, Test: 0.8440, Final Test: 0.8460\n",
"Epoch: 0465, Loss: 0.8934 Train: 0.9643, Val: 0.8100, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0466, Loss: 0.8811 Train: 0.9714, Val: 0.8120, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0467, Loss: 0.7742 Train: 0.9643, Val: 0.8140, Test: 0.8420, Final Test: 0.8460\n",
"Epoch: 0468, Loss: 0.8170 Train: 0.9643, Val: 0.8080, Test: 0.8420, Final Test: 0.8460\n",
"Epoch: 0469, Loss: 0.8599 Train: 0.9643, Val: 0.8100, Test: 0.8410, Final Test: 0.8460\n",
"Epoch: 0470, Loss: 0.9247 Train: 0.9714, Val: 0.8140, Test: 0.8440, Final Test: 0.8460\n",
"Epoch: 0471, Loss: 0.7842 Train: 0.9714, Val: 0.8160, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0472, Loss: 0.7386 Train: 0.9786, Val: 0.8200, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0473, Loss: 0.8234 Train: 0.9786, Val: 0.8180, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0474, Loss: 0.7849 Train: 0.9786, Val: 0.8320, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0475, Loss: 0.8797 Train: 0.9786, Val: 0.8340, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0476, Loss: 0.7919 Train: 0.9857, Val: 0.8380, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0477, Loss: 0.9064 Train: 0.9857, Val: 0.8360, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0478, Loss: 0.8399 Train: 0.9857, Val: 0.8320, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0479, Loss: 0.8438 Train: 0.9857, Val: 0.8340, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0480, Loss: 0.8844 Train: 0.9786, Val: 0.8340, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0481, Loss: 1.1576 Train: 0.9857, Val: 0.8280, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0482, Loss: 0.8215 Train: 0.9857, Val: 0.8240, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0483, Loss: 0.8285 Train: 0.9857, Val: 0.8220, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0484, Loss: 0.8546 Train: 0.9857, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0485, Loss: 0.8226 Train: 0.9857, Val: 0.8140, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0486, Loss: 0.8099 Train: 0.9857, Val: 0.8120, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0487, Loss: 0.8495 Train: 0.9857, Val: 0.8140, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0488, Loss: 0.7798 Train: 0.9857, Val: 0.8140, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0489, Loss: 0.8571 Train: 0.9857, Val: 0.8100, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0490, Loss: 0.9137 Train: 0.9857, Val: 0.8100, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0491, Loss: 0.8118 Train: 0.9857, Val: 0.8100, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0492, Loss: 0.8269 Train: 0.9857, Val: 0.8080, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0493, Loss: 0.8110 Train: 0.9786, Val: 0.8100, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0494, Loss: 0.8545 Train: 0.9786, Val: 0.8120, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0495, Loss: 0.8804 Train: 0.9786, Val: 0.8200, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0496, Loss: 0.8429 Train: 0.9786, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0497, Loss: 0.8500 Train: 0.9786, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0498, Loss: 0.7634 Train: 0.9786, Val: 0.8240, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0499, Loss: 0.9066 Train: 0.9786, Val: 0.8200, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0500, Loss: 0.8394 Train: 0.9786, Val: 0.8200, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0501, Loss: 0.8720 Train: 0.9857, Val: 0.8200, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0502, Loss: 0.8022 Train: 0.9857, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0503, Loss: 0.8284 Train: 0.9857, Val: 0.8200, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0504, Loss: 0.7392 Train: 0.9786, Val: 0.8200, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0505, Loss: 0.8013 Train: 0.9786, Val: 0.8240, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0506, Loss: 0.8331 Train: 0.9714, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0507, Loss: 0.9428 Train: 0.9643, Val: 0.8160, Test: 0.8450, Final Test: 0.8460\n",
"Epoch: 0508, Loss: 0.7554 Train: 0.9643, Val: 0.8160, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0509, Loss: 0.7543 Train: 0.9643, Val: 0.8160, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0510, Loss: 0.7968 Train: 0.9857, Val: 0.8220, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0511, Loss: 0.8360 Train: 0.9857, Val: 0.8200, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0512, Loss: 0.8173 Train: 0.9857, Val: 0.8180, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0513, Loss: 0.7200 Train: 0.9857, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0514, Loss: 0.8164 Train: 0.9857, Val: 0.8280, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0515, Loss: 0.8054 Train: 0.9857, Val: 0.8280, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0516, Loss: 0.7896 Train: 0.9857, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0517, Loss: 0.8713 Train: 0.9857, Val: 0.8240, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0518, Loss: 0.8315 Train: 0.9786, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0519, Loss: 0.7982 Train: 0.9786, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0520, Loss: 0.7951 Train: 0.9786, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0521, Loss: 0.8485 Train: 0.9714, Val: 0.8200, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0522, Loss: 0.6961 Train: 0.9714, Val: 0.8180, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0523, Loss: 0.8082 Train: 0.9714, Val: 0.8180, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0524, Loss: 0.7986 Train: 0.9643, Val: 0.8180, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0525, Loss: 0.7892 Train: 0.9643, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0526, Loss: 0.8231 Train: 0.9643, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0527, Loss: 0.8297 Train: 0.9643, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0528, Loss: 0.8097 Train: 0.9643, Val: 0.8200, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0529, Loss: 0.8310 Train: 0.9643, Val: 0.8220, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0530, Loss: 0.8603 Train: 0.9643, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0531, Loss: 0.8026 Train: 0.9714, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0532, Loss: 0.8027 Train: 0.9714, Val: 0.8260, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0533, Loss: 0.7921 Train: 0.9714, Val: 0.8280, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0534, Loss: 0.7663 Train: 0.9714, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0535, Loss: 0.8090 Train: 0.9714, Val: 0.8260, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0536, Loss: 0.7125 Train: 0.9714, Val: 0.8300, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0537, Loss: 0.8001 Train: 0.9714, Val: 0.8300, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0538, Loss: 0.7815 Train: 0.9786, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0539, Loss: 0.8570 Train: 0.9786, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0540, Loss: 0.7758 Train: 0.9786, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0541, Loss: 0.7745 Train: 0.9786, Val: 0.8240, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0542, Loss: 0.6751 Train: 0.9786, Val: 0.8220, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0543, Loss: 0.7423 Train: 0.9786, Val: 0.8160, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0544, Loss: 0.8875 Train: 0.9714, Val: 0.8160, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0545, Loss: 0.8601 Train: 0.9714, Val: 0.8200, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0546, Loss: 0.7449 Train: 0.9714, Val: 0.8280, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0547, Loss: 0.9113 Train: 0.9714, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0548, Loss: 0.7872 Train: 0.9714, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0549, Loss: 0.8719 Train: 0.9714, Val: 0.8260, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0550, Loss: 0.8010 Train: 0.9714, Val: 0.8260, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0551, Loss: 0.8557 Train: 0.9786, Val: 0.8220, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0552, Loss: 0.7609 Train: 0.9857, Val: 0.8260, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0553, Loss: 0.7760 Train: 0.9857, Val: 0.8280, Test: 0.8640, Final Test: 0.8460\n",
"Epoch: 0554, Loss: 0.7783 Train: 0.9857, Val: 0.8280, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0555, Loss: 0.7433 Train: 0.9857, Val: 0.8280, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0556, Loss: 0.8483 Train: 0.9857, Val: 0.8340, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0557, Loss: 0.8190 Train: 0.9857, Val: 0.8320, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0558, Loss: 0.7493 Train: 0.9857, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0559, Loss: 0.7424 Train: 0.9857, Val: 0.8240, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0560, Loss: 0.7546 Train: 0.9857, Val: 0.8240, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0561, Loss: 0.8109 Train: 0.9857, Val: 0.8240, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0562, Loss: 0.8559 Train: 0.9857, Val: 0.8180, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0563, Loss: 0.9093 Train: 0.9857, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0564, Loss: 0.8063 Train: 0.9857, Val: 0.8260, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0565, Loss: 0.7652 Train: 0.9857, Val: 0.8340, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0566, Loss: 0.7899 Train: 0.9857, Val: 0.8240, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0567, Loss: 0.8577 Train: 0.9786, Val: 0.8260, Test: 0.8450, Final Test: 0.8460\n",
"Epoch: 0568, Loss: 0.8359 Train: 0.9786, Val: 0.8280, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0569, Loss: 0.7815 Train: 0.9714, Val: 0.8300, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0570, Loss: 0.8144 Train: 0.9714, Val: 0.8280, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0571, Loss: 0.7481 Train: 0.9714, Val: 0.8260, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0572, Loss: 0.7681 Train: 0.9714, Val: 0.8300, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0573, Loss: 0.7674 Train: 0.9714, Val: 0.8240, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0574, Loss: 0.7088 Train: 0.9714, Val: 0.8240, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0575, Loss: 0.8322 Train: 0.9643, Val: 0.8220, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0576, Loss: 0.7642 Train: 0.9643, Val: 0.8200, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0577, Loss: 0.8030 Train: 0.9643, Val: 0.8200, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0578, Loss: 0.7716 Train: 0.9643, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0579, Loss: 0.7989 Train: 0.9714, Val: 0.8240, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0580, Loss: 0.7668 Train: 0.9714, Val: 0.8320, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0581, Loss: 0.7635 Train: 0.9714, Val: 0.8300, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0582, Loss: 0.8365 Train: 0.9714, Val: 0.8300, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0583, Loss: 0.8324 Train: 0.9786, Val: 0.8340, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0584, Loss: 0.8413 Train: 0.9857, Val: 0.8320, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0585, Loss: 0.8543 Train: 0.9786, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0586, Loss: 0.8684 Train: 0.9714, Val: 0.8160, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0587, Loss: 0.7368 Train: 0.9714, Val: 0.8140, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0588, Loss: 0.7368 Train: 0.9857, Val: 0.8200, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0589, Loss: 0.7950 Train: 0.9857, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0590, Loss: 0.7483 Train: 0.9786, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0591, Loss: 0.8109 Train: 0.9786, Val: 0.8280, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0592, Loss: 0.8305 Train: 0.9786, Val: 0.8320, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0593, Loss: 0.7356 Train: 0.9786, Val: 0.8300, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0594, Loss: 0.7595 Train: 0.9786, Val: 0.8320, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0595, Loss: 0.7014 Train: 0.9786, Val: 0.8320, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0596, Loss: 0.9535 Train: 0.9786, Val: 0.8260, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0597, Loss: 0.8612 Train: 0.9786, Val: 0.8260, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0598, Loss: 0.7822 Train: 0.9714, Val: 0.8260, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0599, Loss: 0.7478 Train: 0.9714, Val: 0.8260, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0600, Loss: 0.8044 Train: 0.9643, Val: 0.8200, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0601, Loss: 0.9017 Train: 0.9643, Val: 0.8200, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0602, Loss: 0.7398 Train: 0.9714, Val: 0.8160, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0603, Loss: 0.8423 Train: 0.9786, Val: 0.8200, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0604, Loss: 1.0300 Train: 0.9786, Val: 0.8180, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0605, Loss: 0.8502 Train: 0.9714, Val: 0.8180, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0606, Loss: 0.8095 Train: 0.9714, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0607, Loss: 0.8998 Train: 0.9714, Val: 0.8300, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0608, Loss: 0.7138 Train: 0.9714, Val: 0.8300, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0609, Loss: 0.8431 Train: 0.9714, Val: 0.8300, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0610, Loss: 0.8079 Train: 0.9714, Val: 0.8360, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0611, Loss: 0.8194 Train: 0.9714, Val: 0.8380, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0612, Loss: 0.8297 Train: 0.9786, Val: 0.8360, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0613, Loss: 0.7177 Train: 0.9786, Val: 0.8340, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0614, Loss: 0.7796 Train: 0.9786, Val: 0.8340, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0615, Loss: 0.7818 Train: 0.9786, Val: 0.8340, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0616, Loss: 0.8213 Train: 0.9786, Val: 0.8340, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0617, Loss: 0.8401 Train: 0.9786, Val: 0.8320, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0618, Loss: 0.8161 Train: 0.9786, Val: 0.8340, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0619, Loss: 0.8688 Train: 0.9786, Val: 0.8360, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0620, Loss: 0.6842 Train: 0.9786, Val: 0.8360, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0621, Loss: 0.7166 Train: 0.9786, Val: 0.8360, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0622, Loss: 0.8489 Train: 0.9786, Val: 0.8340, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0623, Loss: 0.7535 Train: 0.9786, Val: 0.8300, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0624, Loss: 0.8730 Train: 0.9786, Val: 0.8260, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0625, Loss: 0.7717 Train: 0.9786, Val: 0.8260, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0626, Loss: 0.8859 Train: 0.9786, Val: 0.8260, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0627, Loss: 0.9420 Train: 0.9786, Val: 0.8280, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0628, Loss: 0.8017 Train: 0.9786, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0629, Loss: 0.7708 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0630, Loss: 0.7531 Train: 0.9786, Val: 0.8180, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0631, Loss: 0.7476 Train: 0.9786, Val: 0.8160, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0632, Loss: 0.8459 Train: 0.9714, Val: 0.8200, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0633, Loss: 0.7607 Train: 0.9714, Val: 0.8140, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0634, Loss: 0.7917 Train: 0.9714, Val: 0.8120, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0635, Loss: 0.8150 Train: 0.9714, Val: 0.8140, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0636, Loss: 0.8646 Train: 0.9714, Val: 0.8120, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0637, Loss: 0.8207 Train: 0.9714, Val: 0.8100, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0638, Loss: 0.8526 Train: 0.9786, Val: 0.8180, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0639, Loss: 0.7714 Train: 0.9786, Val: 0.8260, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0640, Loss: 0.7796 Train: 0.9714, Val: 0.8320, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0641, Loss: 0.8392 Train: 0.9714, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0642, Loss: 0.8303 Train: 0.9714, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0643, Loss: 0.9450 Train: 0.9714, Val: 0.8220, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0644, Loss: 0.8306 Train: 0.9786, Val: 0.8300, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0645, Loss: 1.0067 Train: 0.9714, Val: 0.8360, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0646, Loss: 0.7438 Train: 0.9786, Val: 0.8360, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0647, Loss: 0.8551 Train: 0.9786, Val: 0.8320, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0648, Loss: 0.7824 Train: 0.9857, Val: 0.8300, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0649, Loss: 0.8598 Train: 0.9857, Val: 0.8320, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0650, Loss: 0.7855 Train: 0.9857, Val: 0.8300, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0651, Loss: 0.8303 Train: 0.9857, Val: 0.8260, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0652, Loss: 0.7857 Train: 0.9857, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0653, Loss: 0.8057 Train: 0.9857, Val: 0.8260, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0654, Loss: 0.8289 Train: 0.9857, Val: 0.8240, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0655, Loss: 0.8204 Train: 0.9857, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0656, Loss: 0.7179 Train: 0.9857, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0657, Loss: 0.7040 Train: 0.9857, Val: 0.8240, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0658, Loss: 0.7901 Train: 0.9786, Val: 0.8280, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0659, Loss: 0.8100 Train: 0.9786, Val: 0.8280, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0660, Loss: 0.8614 Train: 0.9786, Val: 0.8260, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0661, Loss: 0.7911 Train: 0.9786, Val: 0.8240, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0662, Loss: 0.7222 Train: 0.9786, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0663, Loss: 0.8608 Train: 0.9786, Val: 0.8200, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0664, Loss: 0.8586 Train: 0.9786, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0665, Loss: 0.7878 Train: 0.9714, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0666, Loss: 0.7323 Train: 0.9714, Val: 0.8240, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0667, Loss: 0.7623 Train: 0.9714, Val: 0.8240, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0668, Loss: 0.8756 Train: 0.9786, Val: 0.8240, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0669, Loss: 0.7740 Train: 0.9786, Val: 0.8200, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0670, Loss: 0.9374 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0671, Loss: 0.7736 Train: 0.9714, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0672, Loss: 0.7372 Train: 0.9643, Val: 0.8240, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0673, Loss: 0.7954 Train: 0.9643, Val: 0.8220, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0674, Loss: 0.7913 Train: 0.9714, Val: 0.8280, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0675, Loss: 0.9338 Train: 0.9714, Val: 0.8300, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0676, Loss: 0.8463 Train: 0.9714, Val: 0.8300, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0677, Loss: 0.7733 Train: 0.9786, Val: 0.8300, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0678, Loss: 0.7627 Train: 0.9786, Val: 0.8300, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0679, Loss: 0.8369 Train: 0.9786, Val: 0.8260, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0680, Loss: 0.8792 Train: 0.9786, Val: 0.8240, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0681, Loss: 0.7713 Train: 0.9786, Val: 0.8220, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0682, Loss: 0.8288 Train: 0.9786, Val: 0.8240, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0683, Loss: 0.8347 Train: 0.9786, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0684, Loss: 0.7827 Train: 0.9786, Val: 0.8300, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0685, Loss: 0.7346 Train: 0.9786, Val: 0.8300, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0686, Loss: 0.7115 Train: 0.9786, Val: 0.8300, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0687, Loss: 0.9057 Train: 0.9857, Val: 0.8240, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0688, Loss: 0.7973 Train: 0.9786, Val: 0.8220, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0689, Loss: 0.8335 Train: 0.9786, Val: 0.8200, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0690, Loss: 0.8630 Train: 0.9786, Val: 0.8160, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0691, Loss: 0.6806 Train: 0.9786, Val: 0.8160, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0692, Loss: 0.8711 Train: 0.9786, Val: 0.8180, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0693, Loss: 0.7843 Train: 0.9786, Val: 0.8200, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0694, Loss: 0.8128 Train: 0.9786, Val: 0.8260, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0695, Loss: 0.7607 Train: 0.9786, Val: 0.8260, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0696, Loss: 0.8644 Train: 0.9786, Val: 0.8260, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0697, Loss: 0.7537 Train: 0.9786, Val: 0.8320, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0698, Loss: 0.7893 Train: 0.9786, Val: 0.8360, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0699, Loss: 0.7656 Train: 0.9857, Val: 0.8320, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0700, Loss: 0.8128 Train: 0.9857, Val: 0.8320, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0701, Loss: 0.8188 Train: 0.9857, Val: 0.8260, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0702, Loss: 0.7492 Train: 0.9786, Val: 0.8340, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0703, Loss: 0.8260 Train: 0.9786, Val: 0.8300, Test: 0.8640, Final Test: 0.8460\n",
"Epoch: 0704, Loss: 0.7023 Train: 0.9786, Val: 0.8260, Test: 0.8650, Final Test: 0.8460\n",
"Epoch: 0705, Loss: 0.7612 Train: 0.9786, Val: 0.8260, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0706, Loss: 0.6992 Train: 0.9786, Val: 0.8220, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0707, Loss: 0.7655 Train: 0.9786, Val: 0.8240, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0708, Loss: 0.9245 Train: 0.9786, Val: 0.8240, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0709, Loss: 0.7355 Train: 0.9786, Val: 0.8280, Test: 0.8650, Final Test: 0.8460\n",
"Epoch: 0710, Loss: 0.8108 Train: 0.9786, Val: 0.8280, Test: 0.8640, Final Test: 0.8460\n",
"Epoch: 0711, Loss: 0.7820 Train: 0.9786, Val: 0.8240, Test: 0.8650, Final Test: 0.8460\n",
"Epoch: 0712, Loss: 0.7739 Train: 0.9857, Val: 0.8280, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0713, Loss: 0.7897 Train: 0.9857, Val: 0.8260, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0714, Loss: 0.7886 Train: 0.9857, Val: 0.8280, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0715, Loss: 0.8092 Train: 0.9786, Val: 0.8300, Test: 0.8650, Final Test: 0.8460\n",
"Epoch: 0716, Loss: 0.7435 Train: 0.9786, Val: 0.8320, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0717, Loss: 0.8522 Train: 0.9786, Val: 0.8280, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0718, Loss: 0.7488 Train: 0.9786, Val: 0.8260, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0719, Loss: 0.7633 Train: 0.9786, Val: 0.8260, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0720, Loss: 0.8920 Train: 0.9786, Val: 0.8280, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0721, Loss: 0.7867 Train: 0.9786, Val: 0.8280, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0722, Loss: 0.6924 Train: 0.9786, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0723, Loss: 0.7997 Train: 0.9786, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0724, Loss: 0.8958 Train: 0.9857, Val: 0.8240, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0725, Loss: 0.7449 Train: 0.9786, Val: 0.8200, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0726, Loss: 0.7516 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0727, Loss: 0.8092 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0728, Loss: 0.7689 Train: 0.9786, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0729, Loss: 0.7731 Train: 0.9786, Val: 0.8220, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0730, Loss: 0.7614 Train: 0.9786, Val: 0.8180, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0731, Loss: 0.7913 Train: 0.9786, Val: 0.8220, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0732, Loss: 0.7940 Train: 0.9786, Val: 0.8280, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0733, Loss: 0.9978 Train: 0.9857, Val: 0.8300, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0734, Loss: 0.8053 Train: 0.9786, Val: 0.8300, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0735, Loss: 0.7629 Train: 0.9786, Val: 0.8300, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0736, Loss: 0.7132 Train: 0.9786, Val: 0.8300, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0737, Loss: 0.8145 Train: 0.9786, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0738, Loss: 0.7296 Train: 0.9786, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0739, Loss: 0.8305 Train: 0.9786, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0740, Loss: 0.8129 Train: 0.9786, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0741, Loss: 0.7990 Train: 0.9714, Val: 0.8280, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0742, Loss: 0.8259 Train: 0.9714, Val: 0.8300, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0743, Loss: 0.7739 Train: 0.9786, Val: 0.8200, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0744, Loss: 0.8409 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0745, Loss: 0.7679 Train: 0.9786, Val: 0.8220, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0746, Loss: 0.7449 Train: 0.9786, Val: 0.8200, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0747, Loss: 0.9164 Train: 0.9786, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0748, Loss: 0.8361 Train: 0.9786, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0749, Loss: 0.7770 Train: 0.9714, Val: 0.8220, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0750, Loss: 0.7649 Train: 0.9714, Val: 0.8260, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0751, Loss: 0.7099 Train: 0.9714, Val: 0.8220, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0752, Loss: 0.8081 Train: 0.9714, Val: 0.8180, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0753, Loss: 0.8469 Train: 0.9714, Val: 0.8220, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0754, Loss: 0.7931 Train: 0.9714, Val: 0.8220, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0755, Loss: 0.7657 Train: 0.9714, Val: 0.8240, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0756, Loss: 0.8173 Train: 0.9714, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0757, Loss: 0.8342 Train: 0.9714, Val: 0.8260, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0758, Loss: 0.7337 Train: 0.9714, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0759, Loss: 0.7534 Train: 0.9714, Val: 0.8280, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0760, Loss: 0.7882 Train: 0.9714, Val: 0.8260, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0761, Loss: 0.8915 Train: 0.9714, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0762, Loss: 0.7995 Train: 0.9714, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0763, Loss: 0.7513 Train: 0.9714, Val: 0.8240, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0764, Loss: 0.7512 Train: 0.9714, Val: 0.8220, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0765, Loss: 0.8412 Train: 0.9714, Val: 0.8180, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0766, Loss: 0.8262 Train: 0.9714, Val: 0.8160, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0767, Loss: 0.7228 Train: 0.9714, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0768, Loss: 0.7107 Train: 0.9714, Val: 0.8220, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0769, Loss: 0.8222 Train: 0.9786, Val: 0.8220, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0770, Loss: 0.7457 Train: 0.9786, Val: 0.8240, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0771, Loss: 0.7237 Train: 0.9786, Val: 0.8240, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0772, Loss: 0.7994 Train: 0.9786, Val: 0.8240, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0773, Loss: 0.6802 Train: 0.9786, Val: 0.8140, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0774, Loss: 0.8135 Train: 0.9786, Val: 0.8120, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0775, Loss: 0.7402 Train: 0.9786, Val: 0.8140, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0776, Loss: 0.7866 Train: 0.9786, Val: 0.8140, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0777, Loss: 0.8460 Train: 0.9786, Val: 0.8160, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0778, Loss: 0.7715 Train: 0.9786, Val: 0.8240, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0779, Loss: 0.7382 Train: 0.9714, Val: 0.8260, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0780, Loss: 0.6941 Train: 0.9643, Val: 0.8260, Test: 0.8490, Final Test: 0.8460\n",
"Epoch: 0781, Loss: 0.8333 Train: 0.9714, Val: 0.8280, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0782, Loss: 0.8281 Train: 0.9643, Val: 0.8220, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0783, Loss: 0.7483 Train: 0.9643, Val: 0.8240, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0784, Loss: 0.7514 Train: 0.9714, Val: 0.8200, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0785, Loss: 0.8037 Train: 0.9714, Val: 0.8200, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0786, Loss: 0.9263 Train: 0.9714, Val: 0.8240, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0787, Loss: 0.7517 Train: 0.9714, Val: 0.8260, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0788, Loss: 0.8048 Train: 0.9714, Val: 0.8180, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0789, Loss: 0.8255 Train: 0.9714, Val: 0.8140, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0790, Loss: 0.8268 Train: 0.9714, Val: 0.8100, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0791, Loss: 0.7975 Train: 0.9714, Val: 0.8200, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0792, Loss: 0.8309 Train: 0.9714, Val: 0.8220, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0793, Loss: 0.7877 Train: 0.9714, Val: 0.8140, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0794, Loss: 0.8399 Train: 0.9714, Val: 0.8240, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0795, Loss: 0.6668 Train: 0.9714, Val: 0.8200, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0796, Loss: 0.6957 Train: 0.9857, Val: 0.8160, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0797, Loss: 0.7469 Train: 0.9857, Val: 0.8140, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0798, Loss: 0.7874 Train: 0.9857, Val: 0.8200, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0799, Loss: 0.8129 Train: 0.9857, Val: 0.8240, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0800, Loss: 0.7762 Train: 0.9786, Val: 0.8280, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0801, Loss: 0.8856 Train: 0.9786, Val: 0.8300, Test: 0.8640, Final Test: 0.8460\n",
"Epoch: 0802, Loss: 0.6339 Train: 0.9786, Val: 0.8280, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0803, Loss: 0.6870 Train: 0.9714, Val: 0.8280, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0804, Loss: 0.7882 Train: 0.9714, Val: 0.8280, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0805, Loss: 0.7408 Train: 0.9714, Val: 0.8280, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0806, Loss: 0.8111 Train: 0.9786, Val: 0.8260, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0807, Loss: 0.8000 Train: 0.9786, Val: 0.8180, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0808, Loss: 0.7497 Train: 0.9786, Val: 0.8220, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0809, Loss: 0.7907 Train: 0.9786, Val: 0.8220, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0810, Loss: 0.7618 Train: 0.9786, Val: 0.8240, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0811, Loss: 0.7426 Train: 0.9786, Val: 0.8240, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0812, Loss: 0.7620 Train: 0.9786, Val: 0.8180, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0813, Loss: 0.7728 Train: 0.9786, Val: 0.8200, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0814, Loss: 0.8457 Train: 0.9786, Val: 0.8160, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0815, Loss: 0.7933 Train: 0.9786, Val: 0.8160, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0816, Loss: 0.7356 Train: 0.9786, Val: 0.8180, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0817, Loss: 0.7269 Train: 0.9786, Val: 0.8180, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0818, Loss: 0.9172 Train: 0.9786, Val: 0.8180, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0819, Loss: 0.7765 Train: 0.9786, Val: 0.8180, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0820, Loss: 0.8859 Train: 0.9786, Val: 0.8260, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0821, Loss: 0.7400 Train: 0.9786, Val: 0.8280, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0822, Loss: 0.7770 Train: 0.9786, Val: 0.8260, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0823, Loss: 0.7471 Train: 0.9786, Val: 0.8360, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0824, Loss: 0.7852 Train: 0.9714, Val: 0.8280, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0825, Loss: 0.7256 Train: 0.9714, Val: 0.8340, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0826, Loss: 0.7622 Train: 0.9714, Val: 0.8340, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0827, Loss: 0.7843 Train: 0.9714, Val: 0.8320, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0828, Loss: 0.7462 Train: 0.9786, Val: 0.8280, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0829, Loss: 0.7657 Train: 0.9786, Val: 0.8280, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0830, Loss: 0.7290 Train: 0.9786, Val: 0.8240, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0831, Loss: 0.8552 Train: 0.9786, Val: 0.8220, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0832, Loss: 0.7643 Train: 0.9786, Val: 0.8220, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0833, Loss: 0.7566 Train: 0.9786, Val: 0.8180, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0834, Loss: 0.7718 Train: 0.9786, Val: 0.8200, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0835, Loss: 0.7800 Train: 0.9786, Val: 0.8200, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0836, Loss: 0.7224 Train: 0.9786, Val: 0.8280, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0837, Loss: 0.8878 Train: 0.9714, Val: 0.8240, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0838, Loss: 0.7512 Train: 0.9714, Val: 0.8280, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0839, Loss: 0.7812 Train: 0.9714, Val: 0.8260, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0840, Loss: 0.8135 Train: 0.9714, Val: 0.8260, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0841, Loss: 0.8537 Train: 0.9714, Val: 0.8280, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0842, Loss: 0.8084 Train: 0.9714, Val: 0.8260, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0843, Loss: 0.7341 Train: 0.9714, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0844, Loss: 0.7216 Train: 0.9714, Val: 0.8240, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0845, Loss: 0.8142 Train: 0.9714, Val: 0.8260, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0846, Loss: 0.7430 Train: 0.9714, Val: 0.8300, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0847, Loss: 0.9429 Train: 0.9714, Val: 0.8280, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0848, Loss: 0.8023 Train: 0.9714, Val: 0.8280, Test: 0.8530, Final Test: 0.8460\n",
"Epoch: 0849, Loss: 0.7681 Train: 0.9714, Val: 0.8260, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0850, Loss: 0.7976 Train: 0.9714, Val: 0.8240, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0851, Loss: 0.8061 Train: 0.9714, Val: 0.8280, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0852, Loss: 0.7416 Train: 0.9714, Val: 0.8260, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0853, Loss: 0.7646 Train: 0.9714, Val: 0.8340, Test: 0.8540, Final Test: 0.8460\n",
"Epoch: 0854, Loss: 0.7490 Train: 0.9714, Val: 0.8320, Test: 0.8550, Final Test: 0.8460\n",
"Epoch: 0855, Loss: 0.7934 Train: 0.9714, Val: 0.8320, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0856, Loss: 0.8532 Train: 0.9714, Val: 0.8280, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0857, Loss: 0.7620 Train: 0.9714, Val: 0.8320, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0858, Loss: 0.7481 Train: 0.9714, Val: 0.8340, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0859, Loss: 0.9292 Train: 0.9714, Val: 0.8300, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0860, Loss: 0.7924 Train: 0.9714, Val: 0.8320, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0861, Loss: 0.6909 Train: 0.9714, Val: 0.8320, Test: 0.8460, Final Test: 0.8460\n",
"Epoch: 0862, Loss: 0.7772 Train: 0.9714, Val: 0.8320, Test: 0.8450, Final Test: 0.8460\n",
"Epoch: 0863, Loss: 0.6778 Train: 0.9714, Val: 0.8320, Test: 0.8440, Final Test: 0.8460\n",
"Epoch: 0864, Loss: 0.7570 Train: 0.9714, Val: 0.8320, Test: 0.8470, Final Test: 0.8460\n",
"Epoch: 0865, Loss: 0.7621 Train: 0.9714, Val: 0.8360, Test: 0.8480, Final Test: 0.8460\n",
"Epoch: 0866, Loss: 0.7644 Train: 0.9714, Val: 0.8340, Test: 0.8500, Final Test: 0.8460\n",
"Epoch: 0867, Loss: 0.7585 Train: 0.9714, Val: 0.8340, Test: 0.8520, Final Test: 0.8460\n",
"Epoch: 0868, Loss: 0.7721 Train: 0.9714, Val: 0.8300, Test: 0.8510, Final Test: 0.8460\n",
"Epoch: 0869, Loss: 0.7174 Train: 0.9714, Val: 0.8300, Test: 0.8560, Final Test: 0.8460\n",
"Epoch: 0870, Loss: 0.7429 Train: 0.9714, Val: 0.8300, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0871, Loss: 0.7807 Train: 0.9714, Val: 0.8260, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0872, Loss: 0.6733 Train: 0.9714, Val: 0.8280, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0873, Loss: 0.7525 Train: 0.9714, Val: 0.8260, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0874, Loss: 0.7787 Train: 0.9714, Val: 0.8240, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0875, Loss: 0.6832 Train: 0.9714, Val: 0.8280, Test: 0.8620, Final Test: 0.8460\n",
"Epoch: 0876, Loss: 0.7952 Train: 0.9714, Val: 0.8280, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0877, Loss: 0.6858 Train: 0.9714, Val: 0.8260, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0878, Loss: 0.6546 Train: 0.9714, Val: 0.8240, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0879, Loss: 0.7435 Train: 0.9714, Val: 0.8260, Test: 0.8610, Final Test: 0.8460\n",
"Epoch: 0880, Loss: 0.7055 Train: 0.9786, Val: 0.8280, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0881, Loss: 0.7264 Train: 0.9857, Val: 0.8300, Test: 0.8600, Final Test: 0.8460\n",
"Epoch: 0882, Loss: 0.7906 Train: 0.9857, Val: 0.8260, Test: 0.8570, Final Test: 0.8460\n",
"Epoch: 0883, Loss: 0.9476 Train: 0.9857, Val: 0.8280, Test: 0.8580, Final Test: 0.8460\n",
"Epoch: 0884, Loss: 1.3359 Train: 0.9857, Val: 0.8320, Test: 0.8590, Final Test: 0.8460\n",
"Epoch: 0885, Loss: 0.7516 Train: 0.9786, Val: 0.8340, Test: 0.8630, Final Test: 0.8460\n",
"Epoch: 0886, Loss: 0.7547 Train: 0.9786, Val: 0.8400, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0887, Loss: 0.8893 Train: 0.9786, Val: 0.8420, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0888, Loss: 0.8257 Train: 0.9786, Val: 0.8320, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0889, Loss: 0.7234 Train: 0.9786, Val: 0.8320, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0890, Loss: 0.7820 Train: 0.9786, Val: 0.8280, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0891, Loss: 0.7591 Train: 0.9786, Val: 0.8280, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0892, Loss: 0.8116 Train: 0.9786, Val: 0.8280, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0893, Loss: 0.7138 Train: 0.9786, Val: 0.8260, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0894, Loss: 0.7296 Train: 0.9786, Val: 0.8280, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0895, Loss: 0.7800 Train: 0.9786, Val: 0.8280, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0896, Loss: 0.7616 Train: 0.9786, Val: 0.8280, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0897, Loss: 0.7564 Train: 0.9786, Val: 0.8300, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0898, Loss: 0.7910 Train: 0.9786, Val: 0.8300, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0899, Loss: 0.7664 Train: 0.9786, Val: 0.8320, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0900, Loss: 0.8204 Train: 0.9786, Val: 0.8340, Test: 0.8620, Final Test: 0.8610\n",
"Epoch: 0901, Loss: 0.6911 Train: 0.9786, Val: 0.8340, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0902, Loss: 0.7677 Train: 0.9786, Val: 0.8320, Test: 0.8590, Final Test: 0.8610\n",
"Epoch: 0903, Loss: 0.7609 Train: 0.9786, Val: 0.8260, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0904, Loss: 0.6986 Train: 0.9786, Val: 0.8300, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0905, Loss: 0.8179 Train: 0.9786, Val: 0.8220, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0906, Loss: 0.7210 Train: 0.9786, Val: 0.8280, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0907, Loss: 0.7987 Train: 0.9786, Val: 0.8280, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0908, Loss: 0.8916 Train: 0.9786, Val: 0.8240, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0909, Loss: 0.7364 Train: 0.9786, Val: 0.8280, Test: 0.8590, Final Test: 0.8610\n",
"Epoch: 0910, Loss: 0.7862 Train: 0.9786, Val: 0.8280, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0911, Loss: 0.7608 Train: 0.9786, Val: 0.8300, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0912, Loss: 0.7472 Train: 0.9786, Val: 0.8240, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0913, Loss: 0.7003 Train: 0.9786, Val: 0.8260, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0914, Loss: 0.7020 Train: 0.9786, Val: 0.8260, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0915, Loss: 0.8001 Train: 0.9786, Val: 0.8240, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0916, Loss: 0.7462 Train: 0.9786, Val: 0.8180, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0917, Loss: 0.9884 Train: 0.9786, Val: 0.8220, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0918, Loss: 0.7307 Train: 0.9786, Val: 0.8220, Test: 0.8510, Final Test: 0.8610\n",
"Epoch: 0919, Loss: 0.7638 Train: 0.9786, Val: 0.8180, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0920, Loss: 0.7534 Train: 0.9786, Val: 0.8140, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0921, Loss: 0.7497 Train: 0.9786, Val: 0.8220, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0922, Loss: 0.8384 Train: 0.9786, Val: 0.8240, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0923, Loss: 0.8422 Train: 0.9786, Val: 0.8260, Test: 0.8490, Final Test: 0.8610\n",
"Epoch: 0924, Loss: 0.7389 Train: 0.9714, Val: 0.8220, Test: 0.8440, Final Test: 0.8610\n",
"Epoch: 0925, Loss: 0.7256 Train: 0.9714, Val: 0.8220, Test: 0.8400, Final Test: 0.8610\n",
"Epoch: 0926, Loss: 0.7992 Train: 0.9786, Val: 0.8220, Test: 0.8400, Final Test: 0.8610\n",
"Epoch: 0927, Loss: 0.8505 Train: 0.9786, Val: 0.8220, Test: 0.8410, Final Test: 0.8610\n",
"Epoch: 0928, Loss: 0.8296 Train: 0.9786, Val: 0.8220, Test: 0.8440, Final Test: 0.8610\n",
"Epoch: 0929, Loss: 0.8660 Train: 0.9786, Val: 0.8260, Test: 0.8480, Final Test: 0.8610\n",
"Epoch: 0930, Loss: 0.6306 Train: 0.9786, Val: 0.8280, Test: 0.8510, Final Test: 0.8610\n",
"Epoch: 0931, Loss: 0.7496 Train: 0.9786, Val: 0.8220, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0932, Loss: 0.8062 Train: 0.9786, Val: 0.8220, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0933, Loss: 0.7259 Train: 0.9786, Val: 0.8240, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0934, Loss: 0.8215 Train: 0.9857, Val: 0.8220, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0935, Loss: 0.7315 Train: 0.9857, Val: 0.8180, Test: 0.8590, Final Test: 0.8610\n",
"Epoch: 0936, Loss: 0.7760 Train: 0.9857, Val: 0.8180, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0937, Loss: 0.7674 Train: 0.9857, Val: 0.8220, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0938, Loss: 0.7856 Train: 0.9857, Val: 0.8280, Test: 0.8590, Final Test: 0.8610\n",
"Epoch: 0939, Loss: 0.7508 Train: 0.9857, Val: 0.8280, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0940, Loss: 0.7365 Train: 0.9857, Val: 0.8240, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0941, Loss: 0.8835 Train: 0.9857, Val: 0.8260, Test: 0.8620, Final Test: 0.8610\n",
"Epoch: 0942, Loss: 0.8380 Train: 0.9857, Val: 0.8260, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0943, Loss: 0.9064 Train: 0.9857, Val: 0.8260, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0944, Loss: 0.7587 Train: 0.9857, Val: 0.8260, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0945, Loss: 0.8160 Train: 0.9857, Val: 0.8260, Test: 0.8570, Final Test: 0.8610\n",
"Epoch: 0946, Loss: 0.8342 Train: 0.9857, Val: 0.8220, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0947, Loss: 0.7998 Train: 0.9857, Val: 0.8240, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0948, Loss: 0.7898 Train: 0.9857, Val: 0.8260, Test: 0.8510, Final Test: 0.8610\n",
"Epoch: 0949, Loss: 0.8209 Train: 0.9786, Val: 0.8280, Test: 0.8480, Final Test: 0.8610\n",
"Epoch: 0950, Loss: 0.7304 Train: 0.9786, Val: 0.8320, Test: 0.8490, Final Test: 0.8610\n",
"Epoch: 0951, Loss: 0.7146 Train: 0.9786, Val: 0.8340, Test: 0.8500, Final Test: 0.8610\n",
"Epoch: 0952, Loss: 0.7516 Train: 0.9786, Val: 0.8340, Test: 0.8490, Final Test: 0.8610\n",
"Epoch: 0953, Loss: 0.7577 Train: 0.9786, Val: 0.8320, Test: 0.8490, Final Test: 0.8610\n",
"Epoch: 0954, Loss: 0.7249 Train: 0.9786, Val: 0.8320, Test: 0.8510, Final Test: 0.8610\n",
"Epoch: 0955, Loss: 0.8455 Train: 0.9786, Val: 0.8380, Test: 0.8510, Final Test: 0.8610\n",
"Epoch: 0956, Loss: 0.8390 Train: 0.9786, Val: 0.8380, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0957, Loss: 0.7508 Train: 0.9786, Val: 0.8380, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0958, Loss: 0.7328 Train: 0.9786, Val: 0.8400, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0959, Loss: 0.7524 Train: 0.9786, Val: 0.8360, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0960, Loss: 0.9062 Train: 0.9857, Val: 0.8320, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0961, Loss: 0.8332 Train: 0.9786, Val: 0.8280, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0962, Loss: 0.7729 Train: 0.9786, Val: 0.8220, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0963, Loss: 0.7170 Train: 0.9786, Val: 0.8240, Test: 0.8520, Final Test: 0.8610\n",
"Epoch: 0964, Loss: 0.7140 Train: 0.9786, Val: 0.8220, Test: 0.8500, Final Test: 0.8610\n",
"Epoch: 0965, Loss: 0.8222 Train: 0.9786, Val: 0.8240, Test: 0.8520, Final Test: 0.8610\n",
"Epoch: 0966, Loss: 0.7765 Train: 0.9714, Val: 0.8180, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0967, Loss: 0.7350 Train: 0.9714, Val: 0.8200, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0968, Loss: 0.7587 Train: 0.9714, Val: 0.8200, Test: 0.8540, Final Test: 0.8610\n",
"Epoch: 0969, Loss: 0.7366 Train: 0.9714, Val: 0.8140, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0970, Loss: 0.7815 Train: 0.9714, Val: 0.8160, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0971, Loss: 0.7004 Train: 0.9786, Val: 0.8180, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0972, Loss: 0.7962 Train: 0.9786, Val: 0.8160, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0973, Loss: 0.7910 Train: 0.9786, Val: 0.8140, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0974, Loss: 0.7041 Train: 0.9786, Val: 0.8120, Test: 0.8550, Final Test: 0.8610\n",
"Epoch: 0975, Loss: 0.7209 Train: 0.9786, Val: 0.8200, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0976, Loss: 0.6438 Train: 0.9786, Val: 0.8240, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0977, Loss: 0.7732 Train: 0.9786, Val: 0.8240, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0978, Loss: 0.7421 Train: 0.9786, Val: 0.8320, Test: 0.8620, Final Test: 0.8610\n",
"Epoch: 0979, Loss: 0.7616 Train: 0.9786, Val: 0.8320, Test: 0.8580, Final Test: 0.8610\n",
"Epoch: 0980, Loss: 0.7865 Train: 0.9786, Val: 0.8260, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0981, Loss: 0.7989 Train: 0.9786, Val: 0.8280, Test: 0.8460, Final Test: 0.8610\n",
"Epoch: 0982, Loss: 0.7485 Train: 0.9786, Val: 0.8260, Test: 0.8430, Final Test: 0.8610\n",
"Epoch: 0983, Loss: 0.7651 Train: 0.9857, Val: 0.8280, Test: 0.8460, Final Test: 0.8610\n",
"Epoch: 0984, Loss: 0.7157 Train: 0.9857, Val: 0.8280, Test: 0.8470, Final Test: 0.8610\n",
"Epoch: 0985, Loss: 0.7292 Train: 0.9857, Val: 0.8240, Test: 0.8490, Final Test: 0.8610\n",
"Epoch: 0986, Loss: 0.7714 Train: 0.9857, Val: 0.8260, Test: 0.8530, Final Test: 0.8610\n",
"Epoch: 0987, Loss: 0.7175 Train: 0.9857, Val: 0.8220, Test: 0.8560, Final Test: 0.8610\n",
"Epoch: 0988, Loss: 0.7104 Train: 0.9857, Val: 0.8160, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0989, Loss: 0.8381 Train: 0.9857, Val: 0.8180, Test: 0.8590, Final Test: 0.8610\n",
"Epoch: 0990, Loss: 0.7424 Train: 0.9857, Val: 0.8180, Test: 0.8590, Final Test: 0.8610\n",
"Epoch: 0991, Loss: 0.7286 Train: 0.9857, Val: 0.8220, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0992, Loss: 0.6973 Train: 0.9857, Val: 0.8200, Test: 0.8600, Final Test: 0.8610\n",
"Epoch: 0993, Loss: 0.8375 Train: 0.9857, Val: 0.8280, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0994, Loss: 0.7316 Train: 0.9786, Val: 0.8260, Test: 0.8610, Final Test: 0.8610\n",
"Epoch: 0995, Loss: 0.7491 Train: 0.9786, Val: 0.8240, Test: 0.8630, Final Test: 0.8610\n",
"Epoch: 0996, Loss: 0.7430 Train: 0.9714, Val: 0.8200, Test: 0.8620, Final Test: 0.8610\n",
"Epoch: 0997, Loss: 0.7419 Train: 0.9714, Val: 0.8200, Test: 0.8630, Final Test: 0.8610\n",
"Epoch: 0998, Loss: 0.7173 Train: 0.9714, Val: 0.8220, Test: 0.8620, Final Test: 0.8610\n",
"Epoch: 0999, Loss: 0.8858 Train: 0.9714, Val: 0.8220, Test: 0.8630, Final Test: 0.8610\n",
"Epoch: 1000, Loss: 0.7241 Train: 0.9714, Val: 0.8240, Test: 0.8570, Final Test: 0.8610\n",
"CPU times: user 1min 16s, sys: 761 ms, total: 1min 17s\n",
"Wall time: 1min 18s\n"
]
}
]
},
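{
"cell_type": "markdown",
"source": [
"## Results\n",
"\n",
"Training runs for 1000 epochs in about 1 min 18 s of wall time. The `Final Test` column reports the test accuracy at the epoch with the best validation accuracy seen so far: validation peaks at 0.8420 (epoch 887), which selects a model with a test accuracy of 0.8610 on Cora, in the same range as the Cora results reported in the GCNII paper.\n",
"\n",
"The cell below is a minimal sketch of the model-selection logic implied by the log above, assuming `train()` returns the epoch loss and `test()` returns `(train_acc, val_acc, test_acc)` as defined in the training cells earlier in this notebook. It is illustrative, not an additional run.\n"
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Sketch of best-validation model selection (assumes train() and test() from the cells above).\n",
"best_val_acc = final_test_acc = 0.0\n",
"for epoch in range(1, 1001):\n",
"    loss = train()\n",
"    train_acc, val_acc, tmp_test_acc = test()\n",
"    # Only a new best validation accuracy updates the reported final test accuracy,\n",
"    # which is why 'Final Test' stays flat between improvements in the log.\n",
"    if val_acc > best_val_acc:\n",
"        best_val_acc = val_acc\n",
"        final_test_acc = tmp_test_acc\n",
"    print(f'Epoch: {epoch:04d}, Loss: {loss:.4f} Train: {train_acc:.4f}, '\n",
"          f'Val: {val_acc:.4f}, Test: {tmp_test_acc:.4f}, Final Test: {final_test_acc:.4f}')"
],
"metadata": {},
"execution_count": null,
"outputs": []
}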
]
}