I hereby claim:
- I am ashander on github.
- I am ashander (https://keybase.io/ashander) on keybase.
- I have a public key ASB64liKYYs5Ynv_-hUY2oWx0u_J4KWK9CbksBDJliS7fgo
To claim this, I am signing this object:
// 1. Go to https://twitter.com/i/likes
// 2. Keep scrolling to the bottom repeatedly until all your favs are loaded.
// 3. Run this in your console (in Chrome: View > Developer > JavaScript Console)
// Notes: this may take a while if you have a lot of favs/likes;
// you can only access your most recent ~2000 likes.
// inspired by https://gist.github.com/JamieMason/7580315
$('.ProfileTweet-actionButtonUndo').click()
## example from http://adv-r.had.co.nz/S3.html
newmean <- function(x, ...) {
  UseMethod("newmean", x)
}
newmean.numeric <- function(x, ...) sum(x) / length(x)
newmean.data.frame <- function(x, ...) sapply(x, mean, ...)
newmean.matrix <- function(x, ...) apply(x, 2, mean)
# numeric example: dispatches to newmean.numeric
a <- c(1, 2, 6)
newmean(a)  # 3
# set up Overleaf as a git remote
git remote add overleaf https://git.overleaf.com/blahblahblaah
git checkout -b collaboration overleaf/master
# kill tmux control-mode sessions after quitting iTerm
pkill iTerm
ps | grep tmux
kill -9 [pid]  # for each [pid] corresponding to a `tmux -CC` process
# or match the full command line in one shot: pkill -f 'tmux -CC'
# placeholder
## URLs were updated for GADM 2 -- should go to gadm.org to update these for each run of this
## http://gadm.org/country
## choose a country and "R spdf" from the dropdown
## e.g. for AFG this sends the POST request
##   http://gadm.org/download?OK=OK&_submit_check=1&cnt=AFG_Afghanistan&thm=R%23R%20data
## which returns a page with a list of download links (likely based on your location and data availability)
## part of the page looks like
##
## download
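The request described above can be sketched in shell for an arbitrary country. This is a sketch only: `AFG_Afghanistan` is the one code/name pair given above, and since the notes say the URLs change between GADM releases, the actual fetch is left commented out rather than assumed to still work.

```shell
# Build the download URL described in the notes above for one country.
# cnt is "<ISO3 code>_<country name>"; thm=R%23R%20data asks for the R spdf.
cnt="AFG_Afghanistan"
url="http://gadm.org/download?OK=OK&_submit_check=1&cnt=${cnt}&thm=R%23R%20data"
echo "$url"
# curl -sL "$url"   # would return the page listing the download links
```

Swapping `cnt` is enough to target another country, assuming the endpoint and parameters have not changed since these notes were written.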
## @knitr my-external-chunk
set.seed(111)
rnorm(100)
# get data from html tables in ESA archive
# http://www.esapubs.org/archive/mono/M081/023/suppl-1.htm
library(XML)
root <- 'http://www.esapubs.org/archive/mono/M081/023'
tips <- c('limn.html', 'latitudes.html', 'fish.html', 'acronyms.html')
dat <- lapply(tips, function(t)
  readHTMLTable(paste(root, t, sep='/')))