Yassine Alouini (yassineAlouini)

yassineAlouini / instructions.md
Last active August 28, 2018 18:46
Publish to PyPI
  1. Update the version in the setup.py file.

  2. Run: python3 setup.py sdist bdist_wheel.

  3. Run the following: python3 setup.py sdist upload -r pypi.

Note that this last upload command is deprecated. Use twine instead: pip install twine, then twine upload dist/*.

yassineAlouini / per_hour_plot.py
Last active September 4, 2018 08:59
One tick per hour matplotlib plot (using pandas)
import pandas as pd
import matplotlib.pylab as plt
import matplotlib.dates as mdates
hours = mdates.HourLocator(interval=1)
h_d_fmt = mdates.DateFormatter('%d-%m %H:%M:%S')
DATA_PATH = "/path/to/your/data"
TMS_COL = "timestamp_column"
COL_TO_PLOT = "column_to_plot"
# Likely continuation (the preview is truncated): plot the column
# with one formatted tick per hour on the x-axis.
df = pd.read_csv(DATA_PATH, parse_dates=[TMS_COL])
fig, ax = plt.subplots()
ax.plot(df[TMS_COL], df[COL_TO_PLOT])
ax.xaxis.set_major_locator(hours)
ax.xaxis.set_major_formatter(h_d_fmt)
plt.show()
yassineAlouini / break_per_year.py
Created October 4, 2018 08:20
Break a (large) CSV file into various ones per year.
import pandas as pd
INPUT_PATH = "your/input/path.csv"
OUTPUT_PATH = "your/output/path_{}.csv"
df = pd.read_csv(INPUT_PATH, parse_dates=['tms_gmt'])
df['year'] = df.tms_gmt.dt.year
for year in df['year'].unique():
    # Likely continuation (the preview is truncated): one CSV per year.
    df.loc[df['year'] == year].to_csv(OUTPUT_PATH.format(year), index=False)
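The per-year split above can also be written with groupby, which avoids the helper year column entirely. A minimal sketch (the DataFrame contents and column name mirror the gist but are illustrative):

```python
import pandas as pd

# Illustrative data standing in for the loaded CSV.
df = pd.DataFrame({
    "tms_gmt": pd.to_datetime(["2016-03-01", "2016-07-15", "2017-01-02"]),
    "value": [1, 2, 3],
})

# One sub-DataFrame per year, keyed by the year itself.
per_year = {year: group for year, group in df.groupby(df["tms_gmt"].dt.year)}
# Each group could then be written out with group.to_csv(...).
```

Grouping on a derived Series keeps the output files free of the extra bookkeeping column.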
yassineAlouini / get_data_from_url.py
Created October 25, 2018 15:27
Get GeoJSON data from a URL.
import geopandas as gpd
import requests

def get_data_from_url(url):
    """Fetch GeoJSON from a URL and return it as a GeoDataFrame."""
    data = requests.get(url).json()
    return gpd.GeoDataFrame.from_features(data)
yassineAlouini / deep_meaning_break_down.py
Created November 18, 2018 12:03
Bar plot of the break down of my deep learning box
import pandas as pd
import matplotlib.pylab as plt
import seaborn as sns
# In the clipboard
# piece,price
# gpu,869
# ssd,140
# power,113
# (remaining rows truncated in the preview)
# Likely continuation: read the copied CSV and plot the price break down.
df = pd.read_clipboard(sep=",")
sns.barplot(x="piece", y="price", data=df)
plt.show()
yassineAlouini / hyperopt_graphs.py
Last active December 9, 2018 17:13
Hyperopt graphs
from hyperopt import tpe, fmin, Trials
from hyperopt.hp import normal
from hyperopt.plotting import main_plot_history, main_plot_histogram
import pandas as pd
import matplotlib.pylab as plt

def rosenbrock(suggestion):
    """
    A test function to minimize using hyperopt. The
    global minimum is 0, reached at (1, 1).
    """
    # Likely continuation (the preview is truncated):
    x, y = suggestion
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2
yassineAlouini / add_user_airflow.py
Created January 11, 2019 10:49
Create Airflow users
# Extracted from this blog post: https://tech.marksblogg.com/install-and-configure-apache-airflow.html.
import airflow
from airflow import models, settings
from airflow.contrib.auth.backends.password_auth import PasswordUser
user = PasswordUser(models.User())
user.username = 'username'
user.email = '[email protected]'
# Likely continuation (the preview is truncated), following the linked post:
user.password = 'password'
session = settings.Session()
session.add(user)
session.commit()
session.close()
yassineAlouini / pretrained_unet.py
Created January 13, 2019 12:20
A use case for getting a pretrained Unet with ResNet34 and trained on the ImageNet dataset
# To get the segmentation_models library, run:
# pip install segmentation-models
from segmentation_models import Unet

def build_pretrained_unet_model():
    """Build a pre-trained Unet model."""
    return Unet(backbone_name='resnet34', encoder_weights='imagenet')
yassineAlouini / smaller_historical_transactions.py
Last active January 27, 2019 11:48
Smaller historical transactions DataFrame for the ELO competition.
# This function could be made generic to almost any loaded CSV file with
# pandas. Can you see how to do it?
import pandas as pd
# Some constants
PARQUET_ENGINE = "pyarrow"
DATE_COL = "purchase_date"
CATEGORICAL_COLS = ["card_id", "category_3", "merchant_id", "month_lag",
                    "installments", "state_id", "subsector_id",
                    ]  # Remaining columns truncated in the preview.
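The opening comment asks how to make the shrinking generic to almost any CSV loaded with pandas. One answer (a sketch, not the competition code: the function name and column handling are illustrative) is to downcast numeric columns and turn object columns into categoricals:

```python
import pandas as pd

def shrink_df(df):
    """Return a copy of df with smaller dtypes where the values allow it."""
    out = df.copy()
    for col in out.columns:
        if pd.api.types.is_integer_dtype(out[col]):
            # e.g. int64 -> int8 when all values fit.
            out[col] = pd.to_numeric(out[col], downcast="integer")
        elif pd.api.types.is_float_dtype(out[col]):
            # e.g. float64 -> float32.
            out[col] = pd.to_numeric(out[col], downcast="float")
        elif out[col].dtype == object:
            # Strings with few distinct values compress well as categories.
            out[col] = out[col].astype("category")
    return out
```

Writing the result to Parquet (as the gist's PARQUET_ENGINE constant suggests) preserves these dtypes on disk, which plain CSV would not.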
yassineAlouini / get_dbs_size.sql
Created February 12, 2019 08:46
Get the size of each PostgreSQL database
SELECT t1.datname AS db_name,
       pg_size_pretty(pg_database_size(t1.datname)) AS db_size
FROM pg_database t1
ORDER BY pg_database_size(t1.datname) DESC;