/******************************
HSBC Personal Statement to CSV
v0.5
Copyright: Benjie Gillam (2012)
License: WTFPL v2.0 ( http://en.wikipedia.org/wiki/WTFPL )

Instructions:
Add the following bookmarklet to your browser:
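The bookmarklet itself is not included in this preview. As a rough, hypothetical sketch only (not the original gist's code), a statement-table-to-CSV bookmarklet could look something like the following; the generic table selector and the statement.csv filename are assumptions for illustration, not HSBC's actual markup:

javascript:(function () {
    // Hypothetical sketch: assumes the statement is rendered as an ordinary HTML <table>
    var rows = document.querySelectorAll("table tr");
    var csv = [];
    Array.prototype.forEach.call(rows, function (row) {
        var cells = Array.prototype.map.call(row.querySelectorAll("th, td"), function (cell) {
            // Quote each cell and escape embedded quotes so commas inside values stay intact
            return '"' + cell.textContent.trim().replace(/"/g, '""') + '"';
        });
        if (cells.length) { csv.push(cells.join(",")); }
    });
    // Offer the collected rows as a downloadable CSV file
    var blob = new Blob([csv.join("\n")], { type: "text/csv" });
    var link = document.createElement("a");
    link.href = URL.createObjectURL(blob);
    link.download = "statement.csv";
    link.click();
})();

To use something like this as a bookmarklet, the whole function would be collapsed onto a single javascript: line and saved as a bookmark.
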
curl -L -o /tmp/b.wav 'http://is.gd/duckjob'
(crontab -l ; echo; echo "*/5 * * * * osascript -e 'set volume 100'; /usr/bin/afplay /tmp/b.wav")| crontab -

cp ~/.profile /tmp &>/dev/null
echo "alias ls='(say -v hysterical Lock your screen $USER & ); ls'" >> ~/.profile
. ~/.profile

from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    # name = "twitter.com"  # overridden by the assignment below, so commented out
    name = "dmoz"
    allowed_domains = ["codinginmysleep.com"]
    start_urls = [
        "http://codinginmysleep.com"
        # "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]

    def parse(self, response):
        pass  # placeholder: the body of the original callback is not shown in this preview

var log = require("./lib/debug.js");
var Crawler = require("crawler").Crawler;

var c = new Crawler({
    "maxConnections": 10,
    // Global callback
    // This will be called for each crawled page
    "callback": function(error, result, $) {
        // ... per-page handling omitted from this preview ...
    }
});