Yassine Alouini (yassineAlouini)

⚙️ PyTorch Exploration...
yassineAlouini / upload_image.py
Created February 24, 2019 13:23
Upload an image using Flask
import os
from flask import Flask, redirect, render_template, request, url_for
from werkzeug.utils import secure_filename

DATA_FOLDER_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'static')
ALLOWED_EXTENSIONS = {'.png', '.jpg', '.jpeg'}
IMG_MAX_SIZE = 16 * 1024 * 1024  # 16 MB upload limit
app = Flask(__name__)
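The preview stops at the app object. A minimal sketch of how the upload route could continue, assuming an index.html template with a file input named 'file' (the route, template name, and redirect target are assumptions, not part of the gist):

app.config['MAX_CONTENT_LENGTH'] = IMG_MAX_SIZE

def _is_allowed(filename):
    # Keep only files whose extension is in the whitelist above.
    return os.path.splitext(filename)[1].lower() in ALLOWED_EXTENSIONS

@app.route('/', methods=['GET', 'POST'])
def upload_image():
    # Hypothetical route; the actual gist may name and organize this differently.
    if request.method == 'POST':
        uploaded = request.files.get('file')
        if uploaded and _is_allowed(uploaded.filename):
            filename = secure_filename(uploaded.filename)
            uploaded.save(os.path.join(DATA_FOLDER_PATH, filename))
            return redirect(url_for('upload_image'))
    return render_template('index.html')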
yassineAlouini / predict_top_imagnet_labels.py
Created February 15, 2019 07:56
Predict the top ImageNet labels using a pre-trained Keras model.
import numpy as np
import keras.preprocessing.image as image_utils
from keras.applications.imagenet_utils import decode_predictions, preprocess_input
# You can import any other model if you would like to :)
from keras.applications.resnet50 import ResNet50
IMG_SIZE = (224, 224)
# Loading the model only once
CLASSIFICATION_MODEL = ResNet50()
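The preview ends once the model is loaded. A hedged sketch of how the prediction step might look with this setup (the helper name and the top-5 choice are assumptions):

def predict_top_labels(img_path, top=5):
    # Load and resize the image to the input size expected by ResNet50.
    img = image_utils.load_img(img_path, target_size=IMG_SIZE)
    # Turn it into a batch of one preprocessed array.
    array = image_utils.img_to_array(img)
    batch = preprocess_input(np.expand_dims(array, axis=0))
    predictions = CLASSIFICATION_MODEL.predict(batch)
    # decode_predictions maps class indices to (class_id, label, probability) tuples.
    return decode_predictions(predictions, top=top)[0]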
yassineAlouini / get_dbs_size.sql
Created February 12, 2019 08:46
Get the size of each PostgreSQL database on a server
SELECT t1.datname AS db_name,
       pg_size_pretty(pg_database_size(t1.datname)) AS db_size
FROM pg_database t1
ORDER BY pg_database_size(t1.datname) DESC;
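For quick inspection from Python, one possible way to run this query is through pandas and SQLAlchemy; the connection string below is a placeholder, not part of the gist:

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string: adapt user, password, host, and port.
engine = create_engine("postgresql://user:password@localhost:5432/postgres")
with open("get_dbs_size.sql") as f:
    query = f.read()
sizes_df = pd.read_sql(query, engine)
print(sizes_df)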
yassineAlouini / smaller_historical_transactions.py
Last active January 27, 2019 11:48
Smaller historical transactions DataFrame for the ELO competition.
# This function could be made generic to almost any CSV file loaded with
# pandas. Can you see how to do it?
import pandas as pd

# Some constants
PARQUET_ENGINE = "pyarrow"
DATE_COL = "purchase_date"
CATEGORICAL_COLS = ["card_id", "category_3", "merchant_id", "month_lag",
                    "installments", "state_id", "subsector_id",
                    # ... (preview truncated here)
yassineAlouini / pretrained_unet.py
Created January 13, 2019 12:20
A use case for getting a pretrained Unet with a ResNet34 encoder trained on the ImageNet dataset
# To get the segmentation_models library, run:
# pip install segmentation-models
from segmentation_models import Unet

def build_pretrained_unet_model():
    """Build a pre-trained Unet model."""
    return Unet(backbone_name='resnet34', encoder_weights='imagenet')
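A possible way to use the builder; the optimizer and loss below are chosen purely for illustration and are not part of the gist:

model = build_pretrained_unet_model()
# The returned object is a regular Keras model, so it can be compiled
# and trained as usual; binary cross-entropy suits binary segmentation masks.
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()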
yassineAlouini / add_user_airflow.py
Created January 11, 2019 10:49
Create Airflow users
# Extracted from this blog post: https://tech.marksblogg.com/install-and-configure-apache-airflow.html.
import airflow
from airflow import models, settings
from airflow.contrib.auth.backends.password_auth import PasswordUser
user = PasswordUser(models.User())
user.username = 'username'
user.email = '[email protected]'
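The preview cuts off before the user is persisted. Based on the blog post the gist references, the remaining steps would look roughly like this (the password value is a placeholder):

user.password = 'set_a_strong_password_here'  # placeholder value
# Persist the new user through an Airflow ORM session.
session = settings.Session()
session.add(user)
session.commit()
session.close()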
yassineAlouini / hyperopt_graphs.py
Last active December 9, 2018 17:13
Hyperopt graphs
from hyperopt import tpe, fmin, Trials
from hyperopt.hp import normal
from hyperopt.plotting import main_plot_history, main_plot_histogram
import pandas as pd
import matplotlib.pylab as plt

def rosenbrock(suggestion):
    """
    A test function to minimize using hyperopt. The
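The docstring is cut off in the preview. A hedged sketch, reusing the imports above, of how the rest of the gist might fit together; the two-dimensional Rosenbrock form, the normal priors, and the evaluation budget are all assumptions:

def rosenbrock(suggestion):
    # Classic banana-shaped test function with its minimum at (1, 1).
    x, y = suggestion
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# Search both coordinates with normal priors (mean 0, sigma 2 chosen arbitrarily).
space = [normal('x', 0, 2), normal('y', 0, 2)]
trials = Trials()
best = fmin(fn=rosenbrock, space=space, algo=tpe.suggest, max_evals=200, trials=trials)
print(best)

# The plotting helpers visualize the loss history and its distribution.
main_plot_history(trials)
main_plot_histogram(trials)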
yassineAlouini / deep_meaning_break_down.py
Created November 18, 2018 12:03
Bar plot of the cost breakdown of my deep learning box
import pandas as pd
import matplotlib.pylab as plt
import seaborn as sns
# In the clipboard
# piece,price
# gpu,869
# ssd,140
# power,113
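The preview stops inside the clipboard data. A minimal sketch of the plotting step the imports point to, assuming the CSV above has been copied to the clipboard (the figure size and title are arbitrary choices):

# Read the comma-separated rows straight from the clipboard.
df = pd.read_clipboard(sep=",")

fig, ax = plt.subplots(figsize=(8, 5))
sns.barplot(x="piece", y="price", data=df, ax=ax)
ax.set_ylabel("price")
ax.set_title("Deep learning box cost breakdown")
plt.show()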
yassineAlouini / get_data_from_url.py
Created October 25, 2018 15:27
Get GeoJSON data from a URL.
import geopandas as gpd
import requests

def get_data_from_url(url):
    # Fetch the GeoJSON payload and build a GeoDataFrame from its features.
    data = requests.get(url).json()
    return gpd.GeoDataFrame.from_features(data)
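A short usage example with a placeholder URL; any endpoint returning a GeoJSON FeatureCollection would do:

if __name__ == "__main__":
    # Placeholder URL; replace with a real GeoJSON endpoint.
    url = "https://example.com/data.geojson"
    gdf = get_data_from_url(url)
    print(gdf.head())
    gdf.plot()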
yassineAlouini / break_per_year.py
Created October 4, 2018 08:20
Split a (large) CSV file into one file per year.
import pandas as pd

INPUT_PATH = "your/input/path.csv"
OUTPUT_PATH = "your/output/path_{}.csv"

df = pd.read_csv(INPUT_PATH, parse_dates=['tms_gmt'])
df['year'] = df.tms_gmt.dt.year
for year in df['year'].unique():
    # Likely continuation (the preview stops here): write each year's rows
    # to its own CSV file, filling the year into the output path template.
    df[df['year'] == year].to_csv(OUTPUT_PATH.format(year), index=False)