Anders Eriksen (anderser)

@anderser
anderser / base.html
Created March 16, 2015 08:24
Example base template
<!doctype html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js" ng-app="prosjektnavnApp"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<title></title>
<meta name="description" content="">
@anderser
anderser / responses.json
Created August 10, 2015 07:01
Responses for a given municipality & party
{
"responses": [
{
"id": 455,
"responder_name": "Harald Schjelderup",
"responder_title": "Byrådslederkandidat",
"motivation": "Hei Heidi! Snegler er noe herk, men dette er nok ikke en sak for bystyret :) Lykke til med kampen i hagen! ",
"question": {
"id": 148,
"question": "Kan du love en handlingsplan mot brunsnegler? Hva går denne planen i så fall ut på?",
@anderser
anderser / run_sql.js
Created November 13, 2015 19:29
Small Node command-line script to run SQL against Postgres and/or the CartoDB API
#!/usr/bin/env node
'use strict';
var dir = require('node-dir');
var fs = require('fs');
var http = require('http');
var _ = require('lodash');
var CartoDB = require('cartodb');
var pg = require('pg');
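As a rough illustration of the same idea in Python rather than Node (my substitution, not the gist's implementation), SQL can be run either directly against Postgres with psycopg2 or against CARTO's SQL API over HTTP; the account name, DSN and query below are placeholders:

import psycopg2
import requests

SQL = "SELECT count(*) FROM mytable;"  # placeholder query

def run_on_postgres(sql, dsn="dbname=mydb user=me"):
    # Execute the statement on a Postgres database and return all rows.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(sql)
            return cur.fetchall()

def run_on_cartodb(sql, account="myaccount", api_key=None):
    # Execute the statement via the CARTO SQL API and return the rows.
    params = {"q": sql}
    if api_key:
        params["api_key"] = api_key
    resp = requests.get("https://{0}.carto.com/api/v2/sql".format(account),
                        params=params)
    resp.raise_for_status()
    return resp.json()["rows"]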
@anderser
anderser / streaks.py
Last active January 19, 2016 20:32
Agate compute method to generate streaks of consecutive values in a dataset. WIP
import agate
class Streaks(agate.Computation):
"""
Computes the streaks of consecutive values in a column.
Each streak will be given an increasing
integer value so that you can group by it later to
find the longest consecutive streak.
"""
@anderser
anderser / mydag.py
Created April 11, 2016 19:08
Error in Airflow DAG
from __future__ import print_function
from builtins import range
from airflow.operators import PythonOperator, PostgresOperator, DummyOperator
from airflow.models import DAG
from datetime import datetime, timedelta
import time
from pprint import pprint
seven_days_ago = datetime.combine(
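The preview stops mid-assignment and the error itself is not shown; for orientation, a typical Airflow 1.x skeleton using those imports looks roughly like this (task names and the callable are placeholders, not the gist's code):

from datetime import datetime, timedelta
from pprint import pprint

from airflow.models import DAG
from airflow.operators import PythonOperator

# Start date seven days back, at midnight.
seven_days_ago = datetime.combine(
    datetime.today() - timedelta(7), datetime.min.time())

args = {
    'owner': 'airflow',
    'start_date': seven_days_ago,
}

dag = DAG(dag_id='example_dag', default_args=args,
          schedule_interval=timedelta(days=1))

def print_context(**kwargs):
    # Placeholder task: just dump the context Airflow passes in.
    pprint(kwargs)

run_this = PythonOperator(
    task_id='print_the_context',
    python_callable=print_context,
    provide_context=True,
    dag=dag)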
@anderser
anderser / pipeline.md
Created April 12, 2016 18:53
A sample pipeline/DAG for Airflow
  1. Fetch some JSON/XML file from an external API.
  2. Store the file on S3 (as a backup) with a file name that includes e.g. a timestamp. Pass the filename on to the next task.
  3. Read the JSON/XML from S3 into some table structure (pandas, agate etc.) and adjust field types etc.
  4. Store the table in a temp table in a Postgres database.
  5. Compare the temp table to a "main" table and see if there are changes (some SQL diff). Find out which records have to be added/removed/updated in the "main" table.
  6. If nothing has changed, abort everything. If it has, pass on which records are new, deleted and updated.
  7. a) Insert new records in the main table and alert the newsroom on Slack about new items. b) Delete items in the main table that are not in the temp table; alert via Slack. c) Update records in the main table; alert via Slack.
  8. The end (see the sketch after this list).
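A hedged sketch of how those steps could be wired up as an Airflow DAG (1.x style, matching the era of the gists above); all task ids, callables and the schedule are placeholders, and the callable bodies are left as stubs:

from datetime import datetime, timedelta

from airflow.models import DAG
from airflow.operators import PythonOperator, ShortCircuitOperator

default_args = {'owner': 'airflow', 'start_date': datetime(2016, 4, 1)}

dag = DAG('api_to_newsroom', default_args=default_args,
          schedule_interval=timedelta(hours=1))

def fetch_and_store_on_s3(**context):
    """Steps 1-2: fetch JSON/XML from the API, back it up on S3 under a
    timestamped key and pass the key on (e.g. via XCom)."""

def load_to_temp_table(**context):
    """Steps 3-4: read the file from S3, fix field types and write the
    result to a temp table in Postgres."""

def diff_tables(**context):
    """Steps 5-6: diff the temp table against the main table. Returning
    a falsy value makes the ShortCircuitOperator skip everything downstream."""

def apply_changes_and_alert(**context):
    """Step 7: insert/delete/update records in the main table and alert
    the newsroom on Slack for each kind of change."""

fetch = PythonOperator(task_id='fetch_and_store_on_s3', provide_context=True,
                       python_callable=fetch_and_store_on_s3, dag=dag)
load = PythonOperator(task_id='load_to_temp_table', provide_context=True,
                      python_callable=load_to_temp_table, dag=dag)
diff = ShortCircuitOperator(task_id='diff_tables', provide_context=True,
                            python_callable=diff_tables, dag=dag)
apply_changes = PythonOperator(task_id='apply_changes_and_alert',
                               provide_context=True,
                               python_callable=apply_changes_and_alert,
                               dag=dag)

fetch.set_downstream(load)
load.set_downstream(diff)
diff.set_downstream(apply_changes)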
@anderser
anderser / boatswithin.sql
Created January 6, 2017 08:57
Boats at an aquaculture site on distinct dates (within 200 m)
WITH ais AS
(SELECT *
FROM aisdata.bronnbaater_resampled_min_lowspeed),
lokaliteter AS
(SELECT *
FROM fishandfjord.fiskeridir_alle_lokaliteter
WHERE loknr = '11763')
SELECT truncdate,
loknr,
lokalitet,
@anderser
anderser / abovefold.scss
Last active June 8, 2017 12:05 — forked from dlmr/main.html
Above-the-fold style tag (CSS) using roc webapp react
body {
background-color: red;
}
{
trafficData(trafficRegistrationPointId: "72220V805744") {
trafficRegistrationPoint {
id,
name,
direction {
from,
to,
}
},
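The query above is cut off in the preview; a minimal sketch of posting such a GraphQL query from Python, with the braces closed and the endpoint URL left as a placeholder assumption (it is not given in the gist):

import requests

# Placeholder: substitute the real GraphQL endpoint this query targets.
GRAPHQL_ENDPOINT = "https://example.com/graphql"

QUERY = """
{
  trafficData(trafficRegistrationPointId: "72220V805744") {
    trafficRegistrationPoint {
      id
      name
      direction {
        from
        to
      }
    }
  }
}
"""

# Standard GraphQL-over-HTTP convention: POST the query as JSON.
resp = requests.post(GRAPHQL_ENDPOINT, json={"query": QUERY})
resp.raise_for_status()
print(resp.json()["data"]["trafficData"]["trafficRegistrationPoint"])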