CREATE OR REPLACE PROCEDURE sp_update_ddl()
AS $$
BEGIN
    -- One-off DDL change; adjust schema, table and column names as needed.
    ALTER TABLE myschema.mytable ADD COLUMN newcolumn BIGINT;
EXCEPTION
    WHEN OTHERS THEN
        -- Re-raise with the migration file name to make failures easy to trace.
        RAISE EXCEPTION 'exception in filename.sql';
END;
$$ LANGUAGE plpgsql;
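The procedure wraps a one-off DDL change so that a failure is re-raised with the migration file name. Below is a minimal sketch of invoking it from Python with psycopg2; the connection parameters and the autocommit setting are assumptions for illustration, not part of the gist.

# Sketch: calling the procedure above from Python with psycopg2.
# Connection parameters are placeholders for your Redshift cluster.
import psycopg2

conn = psycopg2.connect(
    host="myhost", port=5439, dbname="mydb", user="myuser", password="mypassword"
)
conn.autocommit = True  # avoid wrapping the DDL in an open transaction
with conn.cursor() as cur:
    cur.execute("CALL sp_update_ddl();")
conn.close()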
@seahrh
seahrh / bitbucket-pipelines.yml
Created January 13, 2021 09:47
Starter Bitbucket Pipelines config for a Python project
definitions:
  steps:
    - step: &tests
        image: python:3.7.9
        script:
          - pip install ."[tests]"
          - mypy src
          - pytest -vv --cov=src
        caches:
          - pip
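The pip install ."[tests]" step assumes the project declares a tests extra. A minimal sketch of the packaging side, assuming a setuptools layout with code under src/; the package name and tool list are illustrative, not taken from the gist.

# Hypothetical setup.py declaring the `tests` extra the pipeline installs.
from setuptools import find_packages, setup

setup(
    name="myproject",
    version="0.1.0",
    package_dir={"": "src"},
    packages=find_packages("src"),
    extras_require={
        "tests": [
            "mypy",
            "pytest",
            "pytest-cov",
        ]
    },
)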
@seahrh
seahrh / librosa_visualize_audio.py
Created January 12, 2021 10:16
1. Prints information about an audio signal, 2. plots the waveform, and 3. creates a player (taken from https://www.audiolabs-erlangen.de/resources/MIR/FMP/B/B_PythonAudio.html)
import os
import numpy as np
from matplotlib import pyplot as plt
import IPython.display as ipd
import librosa
import pandas as pd
%matplotlib inline
def print_plot_play(x, Fs, text=''):
    """1. Prints information about an audio signal, 2. plots the waveform, and 3. creates a player
@seahrh
seahrh / clip.py
Created December 11, 2020 00:55
Keras custom recall and precision metrics using clipping. Adapted from https://neptune.ai/blog/keras-metrics
from tensorflow.keras import backend as K

def recall(y_true, y_pred):
    y_true = K.ones_like(y_true)
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    all_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
    recall = true_positives / (all_positives + K.epsilon())
    return recall

def precision(y_true, y_pred):
    y_true = K.ones_like(y_true)
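The preview is cut off inside precision. A sketch of how it likely continues, mirroring the recall metric above; the f1 helper is an addition for illustration, not shown in the preview.

def precision(y_true, y_pred):
    y_true = K.ones_like(y_true)
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision

def f1(y_true, y_pred):
    # Harmonic mean of the two metrics above.
    p = precision(y_true, y_pred)
    r = recall(y_true, y_pred)
    return 2 * (p * r) / (p + r + K.epsilon())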
@seahrh
seahrh / featureimportance.py
Created November 6, 2020 09:41
Put XGBoost feature importance scores in a dataframe
import pandas as pd

# `clf` is a fitted xgboost.XGBClassifier (or XGBRegressor).
feature_important = clf.get_booster().get_score(importance_type="weight")
keys = list(feature_important.keys())
values = list(feature_important.values())
data = pd.DataFrame(data=values, index=keys, columns=["score"]).sort_values(by="score", ascending=False)
# Top 20 features
data.head(20)
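The snippet assumes clf is an already-fitted XGBoost model. A minimal sketch of that context; the dataset and hyperparameters are illustrative only, not part of the gist.

# Illustrative setup so that `clf` above is defined.
from sklearn.datasets import load_breast_cancer
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
clf = XGBClassifier(n_estimators=50, max_depth=3)
clf.fit(X, y)
# clf.get_booster().get_score(importance_type="weight") now maps feature names to split counts.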
@seahrh
seahrh / showsortkey.sql
Created November 5, 2020 08:52
Redshift: show sort key columns of a table
-- see https://docs.aws.amazon.com/redshift/latest/dg/r_PG_TABLE_DEF.html
select "column", type, encoding, distkey, sortkey
from pg_table_def where schemaname='public' and tablename='mytable' and sortkey!=0;
@seahrh
seahrh / ModelCheckpointInGcs.py
Last active October 8, 2020 18:37
Extends the `keras.callbacks.ModelCheckpoint` callback to save checkpoints in Google Cloud Storage (GCS). Based on TensorFlow 2.3.
from tensorflow import keras
from tensorflow.python.lib.io import file_io

class ModelCheckpointInGcs(keras.callbacks.ModelCheckpoint):
    def __init__(
        self,
        filepath,
        gcs_dir: str,
        monitor="val_loss",
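The preview ends inside __init__. Below is a sketch of one way such a callback could work, copying the local checkpoint to GCS after each save; everything beyond the preview (class name, method bodies, attribute names) is an assumption for illustration, not the gist's code.

import os

class ModelCheckpointInGcsSketch(keras.callbacks.ModelCheckpoint):
    def __init__(self, filepath, gcs_dir: str, monitor="val_loss", **kwargs):
        super().__init__(filepath, monitor=monitor, **kwargs)
        self._gcs_dir = gcs_dir

    def on_epoch_end(self, epoch, logs=None):
        # Let the parent write the checkpoint locally first.
        super().on_epoch_end(epoch, logs)
        # Resolve the formatted path the parent class saved to.
        local_path = self.filepath.format(epoch=epoch + 1, **(logs or {}))
        if os.path.exists(local_path):
            gcs_path = self._gcs_dir.rstrip("/") + "/" + os.path.basename(local_path)
            # file_io.copy understands gs:// destinations.
            file_io.copy(local_path, gcs_path, overwrite=True)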
@seahrh
seahrh / aws-template-bucket-custom-acl.yaml
Created September 23, 2020 10:25 — forked from christianklotz/aws-template-bucket-custom-acl.yaml
CloudFormation template to create S3 bucket resource with custom role
AWSTemplateFormatVersion: 2010-09-09
Transform: AWS::Serverless-2016-10-31
Parameters:
  BucketPrefix:
    Type: String
    Description: "The prefix used for all S3 buckets."
    AllowedPattern: "[a-z-]+"
Resources:
@seahrh
seahrh / connect_psycopg2_to_pandas.py
Created September 18, 2020 02:08 — forked from jakebrinkmann/connect_psycopg2_to_pandas.py
Read a SQL query result into a pandas dataframe via psycopg2
import pandas as pd
import pandas.io.sql as sqlio
import psycopg2

# host, port, dbname, username and pwd are placeholders for your connection details.
conn = psycopg2.connect("host='{}' port={} dbname='{}' user={} password={}".format(host, port, dbname, username, pwd))
sql = "select count(*) from table;"
dat = sqlio.read_sql_query(sql, conn)
conn = None  # drop the reference; psycopg2 closes the connection when it is garbage-collected
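For comparison, a sketch of the same read using pandas.read_sql_query directly with an explicit close; the connection parameters and table name are placeholders, not values from the gist.

import pandas as pd
import psycopg2

conn = psycopg2.connect(host=host, port=port, dbname=dbname, user=username, password=pwd)
try:
    dat = pd.read_sql_query("select count(*) from mytable;", conn)
finally:
    conn.close()  # release the connection even if the query fails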