David Michael dmichael
# Mocha.js test runner support
# https://github.com/jfirebaugh/konacha
# config.driver names the Capybara driver used by the run task
# (e.g. :poltergeist, after installing PhantomJS).
# Without this setting, Konacha tries to run the specs in Firefox.
require 'capybara/poltergeist'

Konacha.configure do |config|
  config.spec_dir     = "spec/javascripts/mocha"
  config.spec_matcher = /_spec\.|_test\./
  config.driver       = :poltergeist
end
@dmichael
dmichael / gist:4741501
Created February 8, 2013 19:53
Retryer usage
# Create the wrapped function with up to 5 retries
command = @commands.retry('connectToProvider', attempts: 5)

# Create the wrapped function with unlimited retries
command = @commands.retry('connectToProvider')

# Calling the wrapped function returns a promise with another method attached
promise = command('yahoo', username: 'x', password: 'y')

# If the call fails, the retry logic kicks in
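The Retryer library itself isn't shown in this gist; as a rough Python sketch of the same idea (all names and semantics here are hypothetical, not Retryer's actual API), a retry wrapper might look like:

```python
import time

def retry(fn, attempts=None, delay=0):
    """Wrap fn so that calling it retries on exception.

    attempts=None means retry without limit (mirroring the
    unlimited-retries case above; semantics assumed, not confirmed).
    """
    def wrapped(*args, **kwargs):
        tried = 0
        while True:
            try:
                return fn(*args, **kwargs)
            except Exception:
                tried += 1
                if attempts is not None and tried >= attempts:
                    raise
                time.sleep(delay)
    return wrapped

# Usage: a provider connection that fails twice, then succeeds
calls = {"n": 0}
def connect_to_provider():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("connection failed")
    return "connected"

command = retry(connect_to_provider, attempts=5)
result = command()  # succeeds on the third underlying call
```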
require 'squeel'

class RealtimeMessage < ActiveRecord::Base
  # These attributes are "mass-assignable", that is, they come in
  # directly from the serial port as XML
  attr_accessible :src, :dsb, :time, :tmprF, :sensor_num, :sensor_type,
                  :radio_id, :ch1_watts, :ch2_watts, :ch3_watts, :mac_address

  def self.this_month
    now            = Time.now
    first_of_month = Date.civil(now.year, now.month, 1)
    last_of_month  = Date.civil(now.year, now.month, -1)
    where { (created_at >= first_of_month) & (created_at <= last_of_month) }
  end
end

# Thanks Seth Darlington!
price_per_kwh = 0.1248 # dollars

query    = RealtimeMessage.this_month
previous = query.first

# Integrate the power readings over time: watts * hours / 1000 = kWh
usage = query.drop(1).reduce(0) do |sum, message|
  elapsed_seconds = message.created_at - previous.created_at
  elapsed_hours   = elapsed_seconds / 3600.0
  previous        = message
  sum + (message.ch1_watts * elapsed_hours / 1000.0)
end

cost = usage * price_per_kwh
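The same integration works outside Rails; a minimal Python version of the kWh accumulation (timestamps and readings invented for illustration):

```python
# Each reading: (unix timestamp in seconds, instantaneous watts)
readings = [(0, 1000), (1800, 1000), (3600, 500)]

price_per_kwh = 0.1248  # dollars

usage_kwh = 0.0
previous_ts, _ = readings[0]
for ts, watts in readings[1:]:
    # Energy over the interval: watts * hours / 1000 = kWh
    elapsed_hours = (ts - previous_ts) / 3600.0
    usage_kwh += watts * elapsed_hours / 1000.0
    previous_ts = ts

cost = usage_kwh * price_per_kwh
# 0.5 kWh + 0.25 kWh = 0.75 kWh -> $0.0936
```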
@dmichael
dmichael / gist:1362424
Created November 13, 2011 18:11
Sigmoid Gradient
function g = sigmoid(z)
  g = 1.0 ./ (1.0 + exp(-z));
end

function g = sigmoidGradient(z)
  g = sigmoid(z) .* (1 - sigmoid(z));
end
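A quick numeric sanity check of the identity g'(z) = g(z)(1 - g(z)), here in Python rather than Octave:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_gradient(z):
    return sigmoid(z) * (1 - sigmoid(z))

# Compare against a central finite difference at a few points
eps = 1e-6
for z in (-2.0, 0.0, 3.0):
    numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)
    assert abs(numeric - sigmoid_gradient(z)) < 1e-8

# The gradient peaks at z = 0, where it equals exactly 0.25
```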
@dmichael
dmichael / gist:1362267
Created November 13, 2011 16:11
Neural Network Regularized Cost Function
% Regularization term: sum of squared weights, excluding the bias
% column (first column) of each parameter matrix
regularization = lambda/(2*m) * ( ...
    sum(sum((Theta1(:,2:end)).^2)) + ...
    sum(sum((Theta2(:,2:end)).^2)) ...
);

% Unregularized cross-entropy cost
J = (1/m) * sum( ...
    sum((-y .* log(h)) - ((1-y) .* log(1-h))) ...
);

J = J + regularization;
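The key detail above is that the bias column (`Theta(:,1)`) is excluded from the penalty. A small pure-Python sketch of just the regularization term (toy weight matrices, invented for illustration):

```python
lam = 1.0   # regularization strength (lambda)
m = 4       # number of training examples

# Toy weight matrices: the first column is the bias and is not penalized
Theta1 = [[0.5, 1.0, 2.0],
          [0.1, 3.0, 0.0]]
Theta2 = [[9.9, 1.0, 1.0, 1.0]]

def penalty(theta):
    # Sum of squares over every entry except the bias column
    return sum(w * w for row in theta for w in row[1:])

# penalty(Theta1) = 1 + 4 + 9 + 0 = 14, penalty(Theta2) = 3
regularization = lam / (2 * m) * (penalty(Theta1) + penalty(Theta2))
# = 17 / 8 = 2.125
```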
@dmichael
dmichael / gist:1362210
Created November 13, 2011 15:14
Neural Network Cost Function
% Cross-entropy cost summed over all m examples and all output units;
% h is the matrix of network outputs, y the matrix of one-hot labels
J = (1/m) * sum( ...
    sum((-y .* log(h)) - ((1-y) .* log(1-h))) ...
);
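A worked scalar example of the same cross-entropy term for a single output unit (Python, values invented): the cost is small when the prediction is confidently right and large when it is confidently wrong.

```python
import math

def cross_entropy(y, h):
    # Cost contributed by one output unit for one example
    return -y * math.log(h) - (1 - y) * math.log(1 - h)

# Label 1, prediction 0.9 -> small cost (-log 0.9 ~= 0.105)
low = cross_entropy(1, 0.9)

# Label 1, prediction 0.1 -> large cost (-log 0.1 ~= 2.303)
high = cross_entropy(1, 0.1)
```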
@dmichael
dmichael / gist:1362199
Created November 13, 2011 15:08
Neural Network Feedforward
m = size(X, 1);

a1 = [ones(m, 1) X];   % Add the bias units to the input matrix
z2 = a1 * Theta1';
a2 = sigmoid(z2);

n = size(a2, 1);
a2 = [ones(n, 1) a2];  % Add the bias units to the hidden layer
z3 = a2 * Theta2';
a3 = sigmoid(z3);      % Network output (hypothesis)
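The same two steps in pure Python for a tiny 2-input, 2-hidden, 1-output network (weights invented for illustration; one weight row per destination unit, first weight in each row is the bias):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, theta1, theta2):
    # Input -> hidden: prepend bias, multiply by Theta1 rows, apply sigmoid
    a1 = [1.0] + x
    z2 = [sum(w * a for w, a in zip(row, a1)) for row in theta1]
    a2 = [sigmoid(z) for z in z2]
    # Hidden -> output: prepend bias, multiply by Theta2 rows, apply sigmoid
    a2 = [1.0] + a2
    z3 = [sum(w * a for w, a in zip(row, a2)) for row in theta2]
    return [sigmoid(z) for z in z3]

Theta1 = [[0.0, 1.0, -1.0],
          [0.5, 0.5, 0.5]]
Theta2 = [[-1.0, 2.0, 2.0]]

output = forward([1.0, 0.0], Theta1, Theta2)
```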
@dmichael
dmichael / gist:1329789
Created November 1, 2011 03:22
Sigmoid function
function g = sigmoid(z)
  % Compute the sigmoid of z.
  % z can be a matrix, vector or scalar.
  g = 1.0 ./ (1.0 + exp(-z));
  % 1 ./ (1 + e.^-z); <-- this is equivalent
end

hypothesis = sigmoid(X*theta);
@dmichael
dmichael / gist:1329765
Created November 1, 2011 03:14
logistic-regression-cost.m
function [J, grad] = costFunction(theta, X, y)
  % Logistic Regression cost function and partial derivative (gradient)
  m = length(y);        % number of training examples
  h = sigmoid(X*theta); % hypothesis

  cost = (y' * log(h)) + ((1-y') * log(1-h));
  J    = -1/m * cost;

  grad = 1/m * (X' * (h - y));
end
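A minimal check of the cost and gradient formulas on a toy dataset, in pure Python (lists instead of matrices; data invented for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_function(theta, X, y):
    """Logistic regression cost J and gradient, same formulas as the
    Octave snippet above, written with plain lists."""
    m = len(y)
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    J = -(1.0 / m) * sum(
        yi * math.log(hi) + (1 - yi) * math.log(1 - hi)
        for yi, hi in zip(y, h)
    )
    # grad_j = (1/m) * sum_i (h_i - y_i) * x_ij
    grad = [
        (1.0 / m) * sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X))
        for j in range(len(theta))
    ]
    return J, grad

# With theta = 0, h = 0.5 for every example, so J = -log(0.5) = log(2)
X = [[1.0, 2.0], [1.0, -1.0]]
y = [1, 0]
J, grad = cost_function([0.0, 0.0], X, y)
```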