Abhishek Sharma (shaabhishek) - GitHub Gists
@shaabhishek
shaabhishek / environment.yml
Created November 8, 2019 19:47
Default conda environment yaml file
name: environment-name
channels:
- pytorch
- defaults
- anaconda
dependencies:
- numpy
- pandas
- pytorch
# - torchvision
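Usage note (not part of the gist): the environment described by this file can be created and activated with

conda env create -f environment.yml
conda activate environment-name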
@shaabhishek
shaabhishek / client_side.py
Created November 12, 2019 21:11
dense_rank use case for assigning the visit_counter to each hospital visit for each patient
# Imports and boilerplate for setting up the database connection
import psycopg2

query_args = {'dbname': dbname, 'host': host, 'port': port, 'user': dbusername}
conn = psycopg2.connect(**query_args)
cur = conn.cursor()
cur.execute('SET search_path to ' + schema_name)
# Set up the query - note it is ALMOST the same as the SQL file's query
query = \
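    """
    -- Hedged sketch: the gist preview cuts off above; the table and column
    -- names (visits, patient_id, visit_id, visit_start_date) are assumptions.
    SELECT patient_id,
           visit_id,
           DENSE_RANK() OVER (
               PARTITION BY patient_id
               ORDER BY visit_start_date
           ) AS visit_counter
    FROM visits
    """
cur.execute(query)
rows = cur.fetchall()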
@shaabhishek
shaabhishek / bidirectional_rnn.py
Created March 4, 2020 19:21
Gist to verify (1) how packing and padding sequences work, and (2) how bidirectional RNNs work
# Questions to answer
# 1. Do the following pipelines give the same hidden states: pad -> rnn -> hidden vs. pack -> rnn -> pad -> hidden?
# 2. Does the BiRNN use the hidden states computed in the forward direction to
#    compute the hidden states computed in the backward direction?
import torch
import torch.nn as nn
import torch.nn.utils.rnn as rnnutils
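# The preview ends at the imports; below is a minimal sketch (not the gist's actual
# code) of the pad-vs-pack comparison from question 1, assuming a small GRU and
# batch_first tensors.
seqs = [torch.randn(5, 3), torch.randn(3, 3), torch.randn(2, 3)]  # variable-length sequences
lengths = torch.tensor([len(s) for s in seqs])                    # already sorted descending
padded = rnnutils.pad_sequence(seqs, batch_first=True)            # (batch, max_len, features)

rnn = nn.GRU(input_size=3, hidden_size=4, batch_first=True, bidirectional=True)

# Path 1: run the RNN directly on the padded batch
out_pad, _ = rnn(padded)

# Path 2: pack, run the RNN, then pad the packed output back
packed = rnnutils.pack_padded_sequence(padded, lengths, batch_first=True, enforce_sorted=True)
out_packed, _ = rnn(packed)
out_unpacked, _ = rnnutils.pad_packed_sequence(out_packed, batch_first=True)

# For the longest sequence both paths see only real timesteps, so they agree;
# for shorter sequences the backward direction of path 1 starts from the padding, so they differ.
print(torch.allclose(out_pad[0], out_unpacked[0]))
print(torch.allclose(out_pad[1, :3], out_unpacked[1, :3]))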
@shaabhishek
shaabhishek / mystyle.sty
Last active February 21, 2022 21:01
Style file for standard latex documents
\ProvidesPackage{mystyle}
\usepackage[utf8]{inputenc}
\usepackage{amsmath, amssymb, bm}
\usepackage{mathtools} % for DeclarePairedDelimiter command
\usepackage{graphicx}
\usepackage{subcaption} %allows drawing subfigures
% Set page size and margins
% Replace `letterpaper' with `a4paper' for the UK/EU standard size
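The preview cuts off at this comment; a plausible next line (an assumption, not necessarily the gist's exact contents) loads the geometry package to set the page size and margins:

\usepackage[letterpaper, top=2cm, bottom=2cm, left=3cm, right=3cm]{geometry}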
@shaabhishek
shaabhishek / batch_file_download.py
Created April 26, 2020 15:51
Script to download files in batch mode
from urllib.request import urlretrieve
from pathlib import Path
import time
import random
slides_dir = Path('/blablabla/CS590M/slides')
url = lambda n: f"https://blabla.com/slecture{n:02}h.pdf"
fname = lambda n: slides_dir / f"slecture{n:02}h.pdf"
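The preview ends at the helper lambdas; a plausible continuation (the number of lectures is an assumption) downloads each file with a random pause between requests:

slides_dir.mkdir(parents=True, exist_ok=True)
for n in range(1, 25):  # hypothetical number of lecture files
    urlretrieve(url(n), fname(n))
    print(f"downloaded {fname(n)}")
    time.sleep(random.uniform(1, 3))  # random pause so the server isn't hammered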
@shaabhishek
shaabhishek / contour-plot-experiments.ipynb
Last active April 16, 2021 16:10
contour plot experiments
@shaabhishek
shaabhishek / boilerplate.py
Created April 16, 2021 16:09
CmdStanPy on Colab Starter Code
########### INSTALL ###########
# !pip install --upgrade cmdstanpy
import os
import urllib.request
import shutil
# Install pre-built CmdStan binary
# (faster than compiling from source via install_cmdstan() function)
tgz_file = 'colab-cmdstan-2.23.0.tar.gz'
tgz_url = 'https://github.com/stan-dev/cmdstan/releases/download/v2.23.0/colab-cmdstan-2.23.0.tar.gz'
if not os.path.exists(tgz_file):
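    # The preview cuts off above. A hedged sketch of the usual continuation
    # (download and unpack the prebuilt binary; exact paths are assumptions):
    urllib.request.urlretrieve(tgz_url, tgz_file)
    shutil.unpack_archive(tgz_file)
# Point CmdStanPy at the unpacked CmdStan directory (directory name is an assumption)
os.environ['CMDSTAN'] = './cmdstan-2.23.0'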

linux: sort -u filename to remove duplicates from file

linux: prefix a command with LC_ALL=C (e.g. LC_ALL=C grep ...) to speed up grep, sed, awk, and sort

latex: use the bm package and its \bm{} command for bold math symbols

linux: awk '{print $1}' your_file | sort | uniq | wc -l to count the unique values in the first column of a file

slurm: sacct --starttime 2022-09-01 --format=User,JobID,Jobname,partition,state,time,start,end,elapsed,MaxRss,MaxVMSize,nnodes,ncpus,nodelist to list all jobs since a specific date

@shaabhishek
shaabhishek / psych_doc.md
Last active January 31, 2022 16:33
Important Methods/Attributes - Psych Project

DatasetAllocator:

  1. Helps create the flattened representation of the timeseries tensor from the raw files

DatasetAllocator.record: 1.


Step 1: Install Anaconda compatible with RHEL 6

The file I used was Anaconda3-2020.11-Linux-x86_64.sh from https://repo.anaconda.com/archive/

wget https://repo.anaconda.com/archive/Anaconda3-2020.11-Linux-x86_64.sh
bash Anaconda3-2020.11-Linux-x86_64.sh

Step 2: Create a new environment.

I tried lots of older and newer versions of Python, but 3.7.0 is the one that worked for me. The patchelf package is necessary for the next step.
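The preview cuts off before the command itself; based on the text above, the environment creation step presumably looks something like this (the environment name is an assumption, and patchelf may need to come from the conda-forge channel):

conda create -n rhel6-env python=3.7.0 patchelf
conda activate rhel6-env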