A bash script that turns mpv into a radio. It depends on the following tools; a rough sketch of how they fit together appears after the list.
- mpv: to play the stream
- dmenu: to choose radio stations
- jq: to parse IPC responses from mpv
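
A rough sketch of the flow, assuming a socket path, a tab-separated stations file, and socat for talking to mpv's JSON IPC; none of these details are taken from the script itself:

#!/usr/bin/env bash
# Sketch only: pick a station with dmenu, play it with mpv, query mpv over IPC with jq.
socket=/tmp/mpv-radio.sock                 # assumed socket path
stations=~/.config/radio/stations.tsv      # hypothetical "name<TAB>url" list

name=$(cut -f1 "$stations" | dmenu -p "station:")
url=$(awk -F'\t' -v n="$name" '$1 == n {print $2}' "$stations")

mpv --no-video --input-ipc-server="$socket" "$url" &

# Ask mpv what it is playing and parse the JSON reply with jq.
sleep 1
echo '{ "command": ["get_property", "media-title"] }' | socat - "$socket" | jq -r '.data // empty'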
# An example to get the remaining rate limit using the GitHub GraphQL API.
import requests

headers = {"Authorization": "Bearer YOUR API KEY"}

def run_query(query):
    # A simple function to use requests.post to make the API call. Note the json= section.
    request = requests.post('https://api.github.com/graphql', json={'query': query}, headers=headers)
    if request.status_code == 200:
        return request.json()
    raise Exception("Query failed to run, returning status code {}".format(request.status_code))
#!/usr/bin/env bash
# Evan Wilde <[email protected]>
# July 20, 2017

# defaults
user="postgres"
passwd=""
host="localhost"
db="ghtorrent"
tmpdir='/tmp'
wl(){
    local ssid
    local conn
    nmcli device wifi rescan > /dev/null
    ssid=$(nmcli device wifi list | tail -n +2 | grep -v '^ *\B--\B' | fzf -m | sed 's/^ *\*//' | awk '{print $1}')
    if [ "x$ssid" != "x" ]; then
        # check if the SSID already has a connection set up
        conn=$(nmcli con | grep "$ssid" | awk '{print $1}' | uniq)
        # Assumed completion: reuse the saved connection if one exists, otherwise connect and prompt for credentials.
        if [ "x$conn" != "x" ]; then
            nmcli con up "$conn"
        else
            nmcli --ask device wifi connect "$ssid"
        fi
    fi
}
export NEO4J_HOME=${NEO4J_HOME-~/Downloads/neo4j-community-3.0.1}
if [ ! -f data-csv.zip ]; then
    curl -OL https://cloudfront-files-1.publicintegrity.org/offshoreleaks/data-csv.zip
fi
export DATA=${PWD}/import
rm -rf "$DATA"
# Assumed continuation: recreate the import directory and unpack the CSVs into it.
mkdir -p "$DATA"
unzip -o data-csv.zip -d "$DATA"
install_from_source <- function(pkg, args = NULL, repos = "http://cran.rstudio.com", overwrite = FALSE, ...) {
  if (!overwrite) {
    if (pkg %in% installed.packages()) {
      message(pkg, " is already installed")
      return()
    }
  }
  # Assumed completion: install from source, forwarding extra install arguments via INSTALL_opts.
  install.packages(pkg, repos = repos, type = "source", INSTALL_opts = args, ...)
}
A quick example of doing data wrangling from the command line, as well as getting to know one of San Francisco's data sets: the San Francisco restaurant inspections, courtesy of the SF Department of Public Health. I don't normally do database work from the command line, but importing bulk data into SQLite is pretty frustrating using the available GUIs or just the shell.
So thank goodness for Christopher Groskopf's csvkit, a suite of Unix-like tools that use Python to robustly handle CSV files. There are a lot of great tools in csvkit, but for this gist I just use csvsql, which can parse a CSV and turn it into properly flavored SQL to pass directly into your database app of choice.
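
As a rough illustration of that workflow (the CSV and database file names here are placeholders, not the ones from the original data set), csvsql can first show the schema it infers and then do the import in one step:

# Print the SQLite-flavored CREATE TABLE statement csvsql infers from the CSV.
csvsql --dialect sqlite inspections.csv

# Create the table and bulk-insert the rows straight into a SQLite database.
csvsql --db sqlite:///sf_inspections.db --insert --tables inspections inspections.csv

# Quick sanity check from the sqlite3 shell.
sqlite3 sf_inspections.db 'SELECT COUNT(*) FROM inspections;'

The --db option takes a SQLAlchemy-style connection string, so the same command works against other databases by swapping the URL.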