Alexander Hanhikoski (alexhanh)
ENV["RAILS_ENV"] ||= "development"
require File.expand_path('../config/environment', __FILE__)
rewards = {
  1000 => 1,
  100  => 5,
  50   => 10,
  20   => 50
}
require 'haversine'
require 'json'
require 'time'
require 'date'
data = JSON.parse(File.read('LocationHistory.json')) # exported location history (e.g. Google Takeout)
jyvaskyla = [62.244747, 25.7472184] # Jyväskylä, Finland (lat, lng)
away_days = {} # dates spent away from Jyväskylä
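The preview of this gist cuts off here. As a rough sketch of where it seems to be heading (the Haversine.distance call, its to_kilometers conversion, and the latitudeE7/longitudeE7/timestampMs field names are assumptions about the haversine gem and the Takeout export, not part of the gist), tallying away days could look like this:

# Mark a day as "away" when a recorded location lies more than ~50 km
# from Jyväskylä (threshold picked arbitrarily for illustration).
data['locations'].each do |loc|
  lat = loc['latitudeE7'] / 1e7
  lng = loc['longitudeE7'] / 1e7
  day = Time.at(loc['timestampMs'].to_i / 1000).to_date
  distance_km = Haversine.distance(jyvaskyla[0], jyvaskyla[1], lat, lng).to_kilometers
  away_days[day] = true if distance_km > 50
end
puts "Days away from Jyväskylä: #{away_days.size}"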
File.open('foo.rb', 'w') do |file|
  file.write(%{
    class Foo
      def initialize
      end

      def bar
        1
      end
    end
  })
end
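Running the script writes foo.rb, and loading the generated file behaves as expected:

require_relative 'foo'
Foo.new.bar # => 1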
alexhanh / gist:5012883
Created February 22, 2013 11:39
Playing around with Grunt tasks. This is a simple cache buster that revisions the given files with an MD5 hash and updates the corresponding file references in index.html.
// Simple cache buster that revisions src files with an md5 hash and replaces the new filepaths in index.html
var crypto = require('crypto');
var path = require('path');
grunt.registerMultiTask('cachebuster', function() {
  var indexSrc = grunt.file.read(".tmp/index.html");
  this.filesSrc.forEach(function(file) {
    var src = grunt.file.read(file);
    var hash = crypto.createHash('md5').update(src).digest("hex");
    var ext = path.extname(file);
    var basename = path.basename(file, ext);
    var revved = basename + "." + hash + ext; // e.g. main.css -> main.<md5>.css
    grunt.file.copy(file, path.join(path.dirname(file), revved));
    indexSrc = indexSrc.replace(path.basename(file), revved); // point index.html at the revved copy
  });
  grunt.file.write(".tmp/index.html", indexSrc);
});

grunt.loadNpmTasks('grunt-contrib'); // Loads all grunt-contrib-* tasks, including cssmin
# Run a raw, sanitized SQL query and return the rows directly,
# skipping ActiveRecord object instantiation.
class Base < ActiveRecord::Base
  def self.select_all(array)
    sql = self.send(:sanitize_sql_array, array)
    connection.select_all(sql)
  end
end
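A hypothetical call for illustration (the streams table and the id come from the console session below, not from this gist); the result is plain rows rather than model instances:

rows = Base.select_all(["SELECT * FROM streams WHERE id = ?", 818])
rows.first["id"] # hash-style access; no ActiveRecord objects are instantiated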
irb(main):021:0> Stream.first.tags
Stream Load (2.3ms) SELECT "streams".* FROM "streams" LIMIT 1
Tag Load (1.6ms) SELECT "tags".* FROM "tags" INNER JOIN "taggings" ON "tags"."id" = "taggings"."tag_id" WHERE "taggings"."stream_id" = 818
=> [#<Tag id: 1, name: "korean">]
irb(main):022:0> Stream.includes(:tags).first.tags
Stream Load (1.9ms) SELECT "streams".* FROM "streams" LIMIT 1
Tagging Load (0.5ms) SELECT "taggings".* FROM "taggings" WHERE "taggings"."stream_id" IN (818)
Tag Load (0.4ms) SELECT "tags".* FROM "tags" WHERE "tags"."id" IN (1)
=> [#<Tag id: 1, name: "korean">]
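The session above contrasts lazy and eager loading: with includes, iterating over many streams avoids firing one tag query per stream (the classic N+1 pattern). A small sketch against the same models:

# One query for streams, one for taggings, one for tags,
# no matter how many streams the loop touches.
Stream.includes(:tags).limit(10).each do |stream|
  puts "#{stream.id}: #{stream.tags.map(&:name).join(', ')}"
end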
def better_fetch
  data = {}
  io = nil
  begin
    io = open(...)
  rescue OpenURI::HTTPError => e
    # Handle interesting http errors
  rescue Timeout::Error, StandardError
    # It was most likely a Net/Socket error we are not interested in, so just fail.
    return nil
  ensure
    io.close if io
  end
  data
end
def fetch
  data = {}
  begin
    io = open(URI.encode(@baseurl))
    doc = Nokogiri::HTML(io)
    io.close!
    # ... process doc ...
  rescue Errno::ETIMEDOUT
    return nil
  end
  data
end
#!/bin/bash
# Log sysbench CPU event throughput roughly every 2 seconds, with a timestamp.
while true
do
  echo "$(sysbench --test=cpu --max-time=2 run | egrep "events") -- $(date +%H:%M:%S)"
done