lokal-profil / index.html
Last active August 29, 2015 14:14
Running odyssey.js with the md as an external file
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Import slides from external md file</title>
<meta name="description" content="">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="stylesheet" href="http://cartodb-libs.global.ssl.fastly.net/cartodb.js/v3/themes/css/cartodb.css">
<link rel="stylesheet" href="http://cartodb.github.io/odyssey.js/editor/css/slides.css">
<script type="text/javascript" src="//cdn.leafletjs.com/leaflet-0.7.3/leaflet.js"></script>
lokal-profil / fixBrokenLinks.py
Last active September 23, 2015 07:16
Script for replacing a list of broken links in both statements and references on Wikidata
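No code is previewed for this gist, so as a rough illustration of the approach the description suggests, here is a minimal pywikibot sketch that swaps known-broken URLs wherever they appear as statement values; the item id and the URL mapping are placeholders, and the reference side (walked via claim.sources with removeSources()/addSources()) is only noted in a comment.

# Hypothetical sketch only; the item id and URL mapping are placeholders.
import pywikibot

BROKEN_TO_FIXED = {
    'http://example.org/old-page': 'http://example.org/new-page',
}

site = pywikibot.Site('wikidata', 'wikidata')
repo = site.data_repository()
item = pywikibot.ItemPage(repo, 'Q42')  # placeholder item
item.get()

for prop, claims in item.claims.items():
    for claim in claims:
        target = claim.getTarget()
        if isinstance(target, str) and target in BROKEN_TO_FIXED:
            claim.changeTarget(BROKEN_TO_FIXED[target],
                               summary='Replacing broken link')
        # References would be handled similarly by walking claim.sources
        # and calling claim.removeSources()/claim.addSources().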
lokal-profil / test_redux.py
Last active November 9, 2015 19:21
LSH: Script for comparing the results before and after the implementation https://github.com/lokal-profil/LSH/issues/3
#!/usr/bin/python
# -*- coding: UTF-8 -*-
#
# Test for redux equivalency
# run from main directory
#
import codecs
import random
files = ('ausstellung_trim.csv', 'ereignis_trim.csv', 'objMass_trim.csv',
         'photo_multimedia_etc.csv', 'kuenstler_trim.csv', 'objDaten_etc.csv',
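The preview above breaks off inside the files tuple and the actual comparison logic is not shown. As a hedged sketch of the general idea (diffing each CSV as produced by the old and the new code), something along these lines would do; the directory names and the file list below are placeholders, not the gist's values.

# Hypothetical sketch only; directories and file list are placeholders.
import codecs

FILES = ('ausstellung_trim.csv', 'ereignis_trim.csv')


def compare(filename, old_dir='old', new_dir='new'):
    with codecs.open('%s/%s' % (old_dir, filename), 'r', 'utf-8') as f:
        old_lines = set(f.read().splitlines())
    with codecs.open('%s/%s' % (new_dir, filename), 'r', 'utf-8') as f:
        new_lines = set(f.read().splitlines())
    print('%s: %d lines removed, %d lines added' % (
        filename, len(old_lines - new_lines), len(new_lines - old_lines)))


for filename in FILES:
    compare(filename)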
lokal-profil / fixSubCollections.py
Last active November 9, 2015 19:20
NatMus: Script for replacing P195 statements that carry a P518 qualifier with the value of that qualifier
#!/usr/bin/python
# -*- coding: utf-8 -*-
import pywikibot
import pywikibot.data.wikidataquery as wdquery
COLLECTION_P = '195'
PART_P = '518'
INSTITUTION_Q = '842858'
PREFIX_MAP = {
    u'NM': {u'subcol': None, u'place': u'Q%s' % INSTITUTION_Q},
    u'NMB': {u'subcol': None, u'place': u'Q%s' % INSTITUTION_Q},
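The preview stops inside PREFIX_MAP, so here is a hedged sketch of just the replacement step the description names: for each P195 (collection) claim with a single P518 (applies to part) qualifier, add a new P195 claim pointing at the qualifier's value and drop the old claim. The item id is a placeholder, and copying of other qualifiers and references is omitted.

# Hypothetical sketch only; the item id is a placeholder and error handling
# is left out.
import pywikibot

COLLECTION_P = 'P195'
PART_P = 'P518'

site = pywikibot.Site('wikidata', 'wikidata')
repo = site.data_repository()
item = pywikibot.ItemPage(repo, 'Q42')  # placeholder item
item.get()

for claim in item.claims.get(COLLECTION_P, []):
    quals = claim.qualifiers.get(PART_P, [])
    if len(quals) != 1:
        continue  # only handle claims with exactly one P518 qualifier
    sub_collection = quals[0].getTarget()
    new_claim = pywikibot.Claim(repo, COLLECTION_P)
    new_claim.setTarget(sub_collection)
    item.addClaim(new_claim, summary='Moving P518 qualifier value to P195')
    item.removeClaims([claim], summary='Replaced by its P518 qualifier value')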
lokal-profil / lowerCaseConnections.py
Last active November 9, 2015 19:46
LSH: Script for a one-time replacement of upper case connections in Materials, Keywords and ObjKeywords
#!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Cleanup function for a one-time replacement of upper case connections
# in Materials, Keywords and ObjKeywords
#
import helpers
import codecs
import os
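The preview only shows the imports, and the actual connection file format lives in the LSH project's helpers module. As a loose sketch of the described clean-up, the snippet below assumes a hypothetical one-"keyword|target"-pair-per-line file and lowercases the keyword side; the format and filename are assumptions, not the gist's code.

# Hypothetical sketch only; the "keyword|target" file format is assumed.
import codecs


def lowercase_connections(filename):
    with codecs.open(filename, 'r', 'utf-8') as f:
        lines = f.read().splitlines()
    fixed = []
    for line in lines:
        keyword, _, target = line.partition('|')
        fixed.append(u'%s|%s' % (keyword.lower(), target))
    with codecs.open(filename, 'w', 'utf-8') as f:
        f.write('\n'.join(fixed))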
lokal-profil / check_WLE_id.py
Created May 14, 2018 15:26
Short script for reproducing WLM-style reporting pages for Wikidata-powered competitions (here WLE in Sweden)
# -*- coding: utf-8 -*-
# python check_WLE_id.py -live -dir:~/Projects/batchUploadTools/
"""Script for updating unused imges/unknonw ids pages for WLE on sv.wp."""
import pywikibot
import wikidataStuff.wdqsLookup as query
SETTING = {
    'prop': 'P3613',
    'formatter_url': 'http://skyddadnatur.naturvardsverket.se/sknat/?nvrid={}',
    'cat': 'Category:Protected areas of Sweden with known IDs',
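The preview cuts off inside SETTING. The gist pulls the known ids through wikidataStuff.wdqsLookup; as a rough stand-in for that step, the same lookup can be sketched directly against the public Wikidata Query Service endpoint. The use of requests, the query text and the User-Agent string below are my assumptions, not the gist's code.

# Hypothetical sketch of the id lookup step: fetch every P3613 value from
# the Wikidata Query Service. The real gist uses wikidataStuff.wdqsLookup.
import requests

ENDPOINT = 'https://query.wikidata.org/sparql'
QUERY = 'SELECT ?item ?value WHERE { ?item wdt:P3613 ?value . }'

response = requests.get(
    ENDPOINT,
    params={'query': QUERY, 'format': 'json'},
    headers={'User-Agent': 'check_WLE_id sketch (example)'})
data = response.json()
known_ids = {
    row['value']['value']: row['item']['value']
    for row in data['results']['bindings']
}
print('%d protected area ids found on Wikidata' % len(known_ids))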
lokal-profil / Burnchart.js
Last active July 4, 2019 08:28
Burnchart calculations for use in a Google Sheets spreadsheet
// Backup copy at https://gist.github.com/lokal-profil/31d8651049d1ebf58bc668c7f27ab288
var rangeRef = "Staff!G1" // this is the cell in which the allowed personnel range (on the same sheet) is specified.
var delimiterLookup = {
"COMMA": ",",
"SEMICOLON": ";",
"PERIOD": ".",
"SPACE": " "
}
/**
lokal-profil / project_connection_import.py
Last active July 19, 2018 14:14
Import script for project connection on Wikidata
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
Quick pywikibot wrapper for statement addition.
"""
import pywikibot
PROP = 'P5008'
TARGET_Q = 'Q123'
REF_PROP = 'P248'
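Only the constants are previewed, so as a hedged guess at what the wrapper boils down to, the sketch below adds a P5008 ("on focus list of Wikimedia project") statement pointing at TARGET_Q and sources it with a P248 ("stated in") reference. The item being edited and the reference target are placeholders.

# Hypothetical sketch only; the item id and reference target are placeholders.
import pywikibot

site = pywikibot.Site('wikidata', 'wikidata')
repo = site.data_repository()
item = pywikibot.ItemPage(repo, 'Q42')  # placeholder item to tag
item.get()

claim = pywikibot.Claim(repo, 'P5008')
claim.setTarget(pywikibot.ItemPage(repo, 'Q123'))  # TARGET_Q from the preview
item.addClaim(claim, summary='Adding project connection')

ref = pywikibot.Claim(repo, 'P248')
ref.setTarget(pywikibot.ItemPage(repo, 'Q123'))  # placeholder source item
claim.addSources([ref], summary='Adding source')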
lokal-profil / heritage_log_handling.sh
Last active August 22, 2018 09:22
Shell script for chopping heritage logs to only the last completed run
#!/bin/bash
#
# Script to make a local copy of the last complete log
# make a local copy of the logs
cp /data/project/heritage/logs/update_monuments.log ./raw.log;
# cut log at last complete
last_complete=$(grep --binary-files=text -n "Done with the update!" raw.log | tail -n1 | cut -f1 -d:);
head -n "$last_complete" raw.log > tmp.log;
lokal-profil / clean_logs.py
Last active August 22, 2018 22:43
Additional cleanup step for analysing WLM logs.
#!/usr/bin/python
# -*- coding: utf-8 -*-
import argparse
import re
def clean_log(filename):
    import codecs
    f_in = codecs.open(filename, 'r', 'utf-8')
    f_out = codecs.open('{}.clean'.format(filename), 'w', 'utf-8')
    bad_warnings = (
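The preview ends just as bad_warnings opens, so its contents are unknown. A hedged sketch of the filtering step that presumably follows: drop any line containing one of the noisy warning strings and write the rest to the .clean file. The patterns and the filename below are placeholders, not the gist's actual values.

# Hypothetical sketch of the filtering idea; patterns and filename are
# placeholders.
import codecs


def clean_log_sketch(filename, bad_warnings):
    f_in = codecs.open(filename, 'r', 'utf-8')
    f_out = codecs.open('{}.clean'.format(filename), 'w', 'utf-8')
    for line in f_in:
        if any(warning in line for warning in bad_warnings):
            continue  # skip known-noisy warning lines
        f_out.write(line)
    f_in.close()
    f_out.close()


clean_log_sketch('example.log', ('Placeholder warning A', 'Placeholder warning B'))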