I hereby claim:
- I am zarkopafilis on github.
- I am zarkopafilis (https://keybase.io/zarkopafilis) on keybase.
- I have a public key ASCKdAxEYonhZm-gAgJrlXnAiKzzmF54I3FIVvrXqQM94go
To claim this, I am signing this object:
CATEGORICAL_FEATURE_KEYS = [
    'workclass',
    'education',
    'marital-status',
    'occupation',
    'relationship',
    '...'
]

NUMERIC_FEATURE_KEYS = [
    'age',
    'capital-gain',
]
import numpy as np

# 'Raw Data'
ds = np.array([[3, 5, 7, 10],
               [12, 14, 16, 21]])

# pass 1 - calculate mean and standard deviation
mean, std = np.mean(ds), np.std(ds)

# pass 2 - normalise
ds_for_train = (ds - mean) / std
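The point of the two-pass scheme is that the statistics computed on the training set must be reused verbatim on any later data; recomputing them at serving time would silently shift the input distribution. A minimal sketch (the `unseen` array is a hypothetical serving-time input, not from the original):

```python
import numpy as np

ds = np.array([[3, 5, 7, 10],
               [12, 14, 16, 21]], dtype=float)

# pass 1 - statistics come from the training data only
mean, std = np.mean(ds), np.std(ds)

# pass 2 - normalise the training data with its own statistics
ds_for_train = (ds - mean) / std

# At serving time, apply the *stored* training statistics to new inputs.
unseen = np.array([4.0, 20.0])
unseen_normalised = (unseen - mean) / std
```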
# 'Raw Data'
ds = ['I am writing articles on medium.', 'Medium is good as a platform.']

# pass 1 - lowercase, strip punctuation
ds = [x.lower().replace('.', '') for x in ds]
# ['i am writing articles on medium', 'medium is good as a platform']

# pass 2 - tokenize
ds_tok = [x.split(' ') for x in ds]
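After tokenization, a model still needs integer inputs, so a natural next pass maps each token to an index. A hedged sketch of that step (the vocabulary pass is an assumed continuation, not shown in the original):

```python
ds = ['I am writing articles on medium.', 'Medium is good as a platform.']
ds = [x.lower().replace('.', '') for x in ds]
ds_tok = [x.split(' ') for x in ds]

# pass 3 (hypothetical) - build a token -> index vocabulary
vocab = {}
for sentence in ds_tok:
    for token in sentence:
        if token not in vocab:
            vocab[token] = len(vocab)

# encode each sentence as a list of integer ids
ds_ids = [[vocab[t] for t in sentence] for sentence in ds_tok]
```

Note that lowercasing first is what lets 'medium' and 'Medium' collapse into a single vocabulary entry.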
encoder_model = Model(encoder_input, encoder_states)

decoder_state_input_h = Input(shape=(latent_dim,))
decoder_state_input_c = Input(shape=(latent_dim,))
decoder_states_inputs = [decoder_state_input_h, decoder_state_input_c]

decoder_output, state_h, state_c = decoder_lstm_layer(
    decoder_embedding,
    initial_state=decoder_states_inputs)
SSD  - https://www.skroutz.gr/s/13867031/Samsung-860-Evo-250GB.html
HDD  - https://www.skroutz.gr/s/2195774/Western-Digital-Blue-1TB-7200rpm.html
RAM  - https://www.skroutz.gr/s/19073883/Corsair-Vengeance-LPX-16GB-DDR4-3600MHz-CMK16GX4M2D3600C18.html
MOBO - https://www.skroutz.gr/s/17143363/Gigabyte-B450M-S2H.html#specs
CPU  - https://www.skroutz.gr/s/19344792/AMD-Ryzen-3-3200G-Box.html
PSU  - https://www.skroutz.gr/s/11542607/Corsair-CX-Series-CX450.html
Case - https://www.skroutz.gr/s/12232137/Cougar-MX330-X.html
Starting...
Loading fasttext model...
Loaded 2000000 words, with 300 vector length encoding.
> Preparing Dataset Loader
Loaded 172526 messages
> Making Model
> Training start
Traceback (most recent call last):
  File "C:\Users\zarkopafilis\Desktop\implybot\testarch.py", line 36, in <module>
    train_loss = train_util.train(model, train_iterator, optimizer, criterion, CLIP)
from __future__ import unicode_literals, print_function, division
from io import open
import fasttext
import unicodedata
import string
import re
import random
import torch
import os
import numpy as np
import pandas as pd
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

features = ["Pclass", "Sex", "SibSp", "Parch"]
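Before these Titanic-style features can feed a `Dense` network, the categorical `Sex` column has to become numeric. A minimal sketch using `pd.get_dummies` (the mini-frame below is hypothetical stand-in data, not the real Titanic CSV):

```python
import pandas as pd

features = ["Pclass", "Sex", "SibSp", "Parch"]

# Hypothetical mini-frame standing in for the Titanic training data.
train = pd.DataFrame({
    "Pclass": [3, 1, 3],
    "Sex": ["male", "female", "female"],
    "SibSp": [1, 1, 0],
    "Parch": [0, 0, 0],
})

# get_dummies one-hot encodes only the string column, so the
# network receives purely numeric input.
X = pd.get_dummies(train[features])
```

`get_dummies` leaves the already-numeric columns untouched and expands `Sex` into `Sex_female` / `Sex_male` indicator columns.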
<template>
  <Layout>
    <div style="text-align: center;">
      <br /><br />
      <h1>Page not found</h1><br /><br />
      <g-image src="/shuffleparrot.gif" /><br /><br />
      <h3>Bad url or under construction</h3>
    </div>
  </Layout>
</template>