@lejarx
lejarx / 0_reuse_code.js
Created April 24, 2014 11:34
Here are some things you can do with Gists in GistBox.
// Use Gists to store code you would like to remember later on
console.log(window); // log the "window" object to the console
@lejarx
lejarx / mappingFlows.R
Created December 20, 2015 08:50 — forked from oscarperpinan/mappingFlows.R
An alternative implementation of "Mapping Flows in R" (http://spatial.ly/2015/03/mapping-flows/) using `data.table` and `lattice`
### DATA SECTION
library(data.table)
## Read data with 'data.table::fread'
input <- fread("wu03ew_v1.csv", select = 1:3)
setnames(input, 1:3, new = c("origin", "destination", "total"))
## Coordinates
centroids <- fread("msoa_popweightedcentroids.csv")
## 'Code' is the key to be used in the joins
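The preview cuts off at the join step, but the comments sketch the idea: flows keyed by an MSOA `Code` are joined to population-weighted centroids so each origin-destination pair gets line endpoints. A minimal Python sketch of that join, with made-up codes and coordinates for illustration:

```python
# Toy centroid lookup: Code -> (easting, northing). Values are invented.
centroids = {
    "E02000001": (532482.0, 181269.0),
    "E02000002": (548612.0, 186731.0),
}

# Toy flow table with the column names used in the gist.
flows = [
    {"origin": "E02000001", "destination": "E02000002", "total": 37},
]

def flow_segments(flows, centroids):
    """Attach start/end coordinates to each origin-destination flow."""
    segments = []
    for f in flows:
        o = centroids.get(f["origin"])
        d = centroids.get(f["destination"])
        if o and d:  # skip flows whose codes have no known centroid
            segments.append({"x0": o[0], "y0": o[1],
                             "x1": d[0], "y1": d[1],
                             "total": f["total"]})
    return segments

segs = flow_segments(flows, centroids)
```

Each resulting segment can then be drawn as one line whose weight or alpha scales with `total`, which is essentially what the `lattice` plotting half of the gist does.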
@lejarx
lejarx / googleVis_with_knitr_and_RStudio.Rmd
Created January 6, 2016 06:12 — forked from mages/googleVis_with_knitr_and_RStudio.Rmd
Interactive reports in R with knitr and RStudio
My first examples with [***knitr***](http://yihui.name/knitr/)
-----------------------------------------
Let's include some simple R code:
```{r}
1+2
```
That worked.
Let's include a plot:
```{r fig.width=4, fig.height=4}
library(ggplot2)
library(grid)
data(iris)
x <- jitter(iris[,c('Sepal.Length')])
y <- jitter(iris[,c('Sepal.Width')])
z <- factor(iris[,c('Species')])
# The color blind palette without black:
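The snippet jitters the iris columns before plotting so overlapping measurements separate visually. A small Python sketch of the same jitter step (the `0.1` noise amount is an arbitrary choice for illustration, not R's default):

```python
import random

random.seed(0)  # reproducible noise for the example

def jitter(values, amount=0.1):
    """Add small uniform noise to each value, roughly like R's jitter()."""
    return [v + random.uniform(-amount, amount) for v in values]

sepal_length = [5.1, 4.9, 4.7, 5.1]  # note the duplicated 5.1
jittered = jitter(sepal_length)
```

After jittering, the two identical `5.1` readings no longer plot on the exact same point.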
library(idbr) # devtools::install_github('walkerke/idbr')
library(ggplot2)
library(animation)
library(dplyr)
library(ggthemes)
idb_api_key("Put your Census API key here")
male <- idb1('CH', 2016:2050, sex = 'male') %>%
mutate(POP = POP * -1,
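The `POP * -1` line is the standard population-pyramid trick: negating male counts makes a horizontal bar chart mirror the sexes around zero. A toy illustration with invented numbers:

```python
# Invented age-group counts, just to show the mirroring step.
ages = ["0-4", "5-9", "10-14"]
male = [42_000, 40_500, 39_000]
female = [40_000, 39_500, 38_500]

# Negate one sex so its bars extend left of the zero axis.
male_plot = [-m for m in male]
```

Plotting `male_plot` and `female` as horizontal bars against `ages` then yields the familiar back-to-back pyramid; axis labels are usually reformatted afterwards so the left side shows positive magnitudes.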
@lejarx
lejarx / mapper_tutorial.py
Created August 21, 2019 15:08 — forked from ahadsheriff/mapper_tutorial.py
Crawl and scrape URLs to map a website
from bs4 import BeautifulSoup
import requests
import requests.exceptions
from urllib.parse import urlsplit
from urllib.parse import urlparse
from collections import deque
import re
url = "https://scrapethissite.com"
# a queue of urls to be crawled
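The preview stops at the queue setup, but the imports signal the usual crawl loop: pop a URL, extract links, keep only same-domain ones, repeat. A self-contained sketch of that loop, using a regex instead of BeautifulSoup and a fake `fetch` function so it runs without network access (both are simplifications, not the gist's actual code):

```python
import re
from collections import deque
from urllib.parse import urljoin, urlsplit

HREF_RE = re.compile(r'href="([^"]+)"')  # crude stand-in for BeautifulSoup

def crawl(start_url, fetch, max_pages=10):
    """Breadth-first crawl over same-domain links.

    `fetch(url)` must return the page HTML as a string.
    """
    base = urlsplit(start_url).netloc
    queue, seen = deque([start_url]), {start_url}
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        for href in HREF_RE.findall(fetch(url)):
            link = urljoin(url, href)  # resolve relative links
            if urlsplit(link).netloc == base and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

# Two-page fake site to demonstrate the domain filter.
pages = {
    "https://scrapethissite.com": '<a href="/pages">Pages</a>',
    "https://scrapethissite.com/pages": '<a href="https://example.com/x">ext</a>',
}
found = crawl("https://scrapethissite.com", lambda u: pages.get(u, ""))
```

The external `example.com` link is discovered but filtered out, which is exactly the "map a website" behaviour the gist description promises.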
library(RemixAutoML)
library(data.table)
###########################################
# Prepare data for AutoTS()----
###########################################
# Load Walmart Data from Dropbox----
data <- data.table::fread("https://www.dropbox.com/s/2str3ek4f4cheqi/walmart_train.csv?dl=1")
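The snippet's "prepare data" step boils down to reading the Walmart CSV and shaping it into per-series records for `AutoTS()`. A dependency-free Python sketch of that parse, using an inline sample; the column names (`Store`, `Date`, `Weekly_Sales`) are assumed from the commonly used Kaggle Walmart dataset and are not confirmed by the gist:

```python
import csv
import io

# Inline stand-in for the Dropbox CSV; values are illustrative only.
sample = io.StringIO(
    "Store,Date,Weekly_Sales\n"
    "1,2010-02-05,24924.50\n"
    "1,2010-02-12,46039.49\n"
)

# Group rows into one (date, sales) series per store.
series = {}
for row in csv.DictReader(sample):
    series.setdefault(row["Store"], []).append(
        (row["Date"], float(row["Weekly_Sales"]))
    )
```

Each store's list of `(date, value)` pairs is then a single time series of the kind an automated forecaster iterates over.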