
Ryan LeCompte ryanlecompte

@viktorklang
viktorklang / SerializedExecutionContext.scala
Created January 17, 2013 00:32
Wraps an ExecutionContext into a new ExecutionContext which will execute its tasks in sequence, always.
import java.util.concurrent.ConcurrentLinkedQueue
import java.util.concurrent.atomic.AtomicInteger
import scala.concurrent.ExecutionContext
import scala.util.control.NonFatal
import scala.annotation.tailrec

object SerializedExecutionContext {
  def apply(batchSize: Int)(implicit context: ExecutionContext): ExecutionContext = {
    require(batchSize > 0, s"SerializedExecutionContext.batchSize must be greater than 0 but was $batchSize")
    new ConcurrentLinkedQueue[Runnable] with Runnable with ExecutionContext {
      private final val on = new AtomicInteger(0)
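The preview above cuts off mid-definition. As a rough illustration of the same serialization idea only (not viktorklang's actual implementation; the class name SimpleSerializedEC and its details are assumptions), a wrapper can queue incoming Runnables and drain them one at a time on the underlying context:

import java.util.concurrent.ConcurrentLinkedQueue
import java.util.concurrent.atomic.AtomicBoolean
import scala.concurrent.ExecutionContext
import scala.util.control.NonFatal

// Minimal sketch only: queue tasks and drain them one at a time on the
// underlying context, so tasks submitted here never run concurrently.
class SimpleSerializedEC(underlying: ExecutionContext) extends ExecutionContext {
  private val queue   = new ConcurrentLinkedQueue[Runnable]
  private val running = new AtomicBoolean(false)

  def execute(task: Runnable): Unit = { queue.add(task); tryDrain() }
  def reportFailure(cause: Throwable): Unit = underlying.reportFailure(cause)

  private def tryDrain(): Unit =
    // Only one drain loop may be active at a time, which is what serializes execution.
    if (!queue.isEmpty && running.compareAndSet(false, true))
      underlying.execute(new Runnable {
        def run(): Unit =
          try {
            var task = queue.poll()
            while (task ne null) {
              try task.run() catch { case NonFatal(t) => reportFailure(t) }
              task = queue.poll()
            }
          } finally {
            running.set(false)
            tryDrain() // re-check for tasks enqueued while we were finishing
          }
      })
}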
@ahoward
ahoward / render.rb
Created August 20, 2012 17:19
rendering outside a controller
av = ActionView::Base.new(Rails.application.config.paths['app/views'].first)
### av.controller = Current.mock_controller # gem install rails_current
av.render( :locals => {:model => Model.first}, :partial => "shared/model" ) #=> string
@viktorklang
viktorklang / linearize.scala
Created August 14, 2012 10:02
A Linearized version of Sequence
import scala.concurrent._
import scala.collection.mutable.Builder
import scala.collection.generic.CanBuildFrom
import language.higherKinds
/**
 * Linearize asynchronously applies a given function in-order to a sequence of values, producing a Future with the result of the function applications.
 * Execution of subsequent entries will be aborted if an exception is thrown in the application of the function.
 */
def linearize[T, U, C[T] <: Traversable[T]](s: C[T])(f: T => U)(implicit cbf: CanBuildFrom[C[T], U, C[U]], e: ExecutionContext): Future[C[U]] = {
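The preview shows only the signature. One possible body, sketched under the same 2.10-era collections signature (this is a guess at the approach, not the gist's actual code; linearizeSketch and the step helper are illustrative names): run f inside a Future for one element at a time and chain each step with flatMap, so a failed step short-circuits the rest.

// Sketch of one way to implement the signature above (not the gist's body):
// each element is processed only after the previous Future has completed,
// and a failure stops further processing via flatMap.
def linearizeSketch[T, U, C[T] <: Traversable[T]](s: C[T])(f: T => U)(
    implicit cbf: CanBuildFrom[C[T], U, C[U]], e: ExecutionContext): Future[C[U]] = {
  def step(it: Iterator[T], b: Builder[U, C[U]]): Future[C[U]] =
    if (!it.hasNext) Future.successful(b.result())
    else {
      val t = it.next()
      Future(f(t)).flatMap(u => step(it, b += u))
    }
  step(s.toIterator, cbf(s))
}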
@stouset
stouset / composition.rb
Created August 8, 2012 15:15
Function Composition
class Proc
  def +(other)
    ->(*a1, &b1) do
      ->(*a2, &b2) do
        inner = other.to_proc.call(*a1, &b1)
        outer = self.call(inner, *a2, &b2)
      end
    end
  end
end
#!/usr/bin/ruby
require 'time'

def format_time(seconds)
  hours = (seconds / 3600).to_i
  minutes = ((seconds % 3600) / 60).to_i
  seconds = (seconds % 60).to_i
  minutes = "0#{minutes}" if minutes < 10
  seconds = "0#{seconds}" if seconds < 10
#
# Ideas stolen from lograge and brought to Rails 2.3
# https://github.com/mattmatt/lograge/blob/master/lib/lograge/log_subscriber.rb
#
module ImprovedControllerLogging
  def self.included(base)
    base.alias_method_chain :log_processing, :fixup
    base.inject_alias_method_chain :perform_action,
                                   :perform_action_with_benchmark,
@sausheong
sausheong / circularlist.rb
Created April 29, 2012 05:09
Circular doubly-linked list
class CircularList < Array
  def index
    @index ||= 0
    @index.abs
  end

  def current
    @index ||= 0
    get_at(@index)
  end
@ryandotsmith
ryandotsmith / process-partitioning.md
Created April 13, 2012 06:40
Process Partitioning

Process Partitioning

The Problem

When working with large, high-volume, low-latency systems, processing data sequentially often becomes detrimental to the system's health. If we only allow one process to work on our data, we run into several challenges (a sketch of the partitioning approach follows the list):

  • Our process may fall behind, resulting in a situation in which it is impossible for it to catch up.
  • Our singleton process could crash and leave our system in a degraded state.
  • The average latency of data processing could be dramatically affected by outlying cases.
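
A minimal sketch of the partitioning idea, assuming records carry a numeric id (Record, WorkerCount, and partitionFor are illustrative names, not from the gist): every worker computes the same deterministic partition for a record and only handles records that map to its own partition, so the stream is split across N independent processes without coordination.

// Illustrative sketch only; the names and the worker count are assumptions.
final case class Record(id: Long, payload: String)

object Partitioning {
  val WorkerCount = 4

  // Deterministic: every process computes the same partition for a record,
  // so each record is handled by exactly one worker.
  def partitionFor(record: Record): Int = (record.id % WorkerCount).toInt

  // A worker started with its partition number handles only its own share.
  def process(workerPartition: Int, records: Seq[Record]): Unit =
    records.filter(partitionFor(_) == workerPartition).foreach { r =>
      println(s"worker $workerPartition handling record ${r.id}")
    }
}

A worker launched as process 0 of 4 would call Partitioning.process(0, batch) on each batch it reads; adding capacity means raising WorkerCount and starting more worker processes.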
class Proxy
  def initialize(obj, modules)
    @obj = obj
    modules.each do |mod|
      extend mod
    end
  end

  def method_missing(meth, *args, &block)
module Papertrail
  module Zookeeper
    class NodeWatcher
      include Watchable

      attr_reader :name

      def initialize(zk, name, options = {})
        @zk = zk
        @name = name