Bahadir Cambel (bcambel)
🌴 On vacation
@bcambel
bcambel / intro.scala
Created November 23, 2011 11:56
Introduction to Scala
// def <function-name>(<parameter-name>: <parameter-type>) { ... }

// Minimal Account class assumed here so the snippet compiles (not part of this preview).
class Account(var balance: Double) {
  def deposit(amount: Double) { balance += amount }
  def withdraw(amount: Double) { balance -= amount }
}

def changingBalance(account: Account) {
  println("====== Starting balance: " + account.balance)
  println("Depositing $10.0")
  account.deposit(10.0d)
  println("New balance: " + account.balance)
  println("Withdrawing $5.00")
  account.withdraw(5.0)
  println("New balance is " + account.balance)
}
@bcambel
bcambel / gist:1537176
Created December 30, 2011 01:39 — forked from joerussbowman/gist:1176002
Tornado Twitter Stream
# not pretty, I quickly moved to another idea
# not involving the Twitter stream. Good starting
# point.
import re
import base64
import socket
import asyncmongo
from tornado import ioloop
from tornado import iostream
# Author: Pieter Noordhuis
# Description: Simple demo to showcase Redis PubSub with EventMachine
#
# Update 7 Oct 2010:
# - This example does *not* appear to work with Chrome >=6.0. Apparently,
# the WebSocket protocol implementation in the cramp gem does not work
# well with Chrome's (newer) WebSocket implementation.
#
# Requirements:
# - rubygems: eventmachine, thin, cramp, sinatra, yajl-ruby
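The gem list above is the Ruby/EventMachine side of the demo; the Redis pub/sub mechanism it showcases is language-agnostic. A minimal sketch using Python and the redis-py package (an assumption, not part of the original gist; the "chat" channel name is arbitrary):

import redis

# Subscriber: connect to a local Redis server and print everything
# published on the (arbitrary) "chat" channel.
r = redis.Redis(host="localhost", port=6379)
pubsub = r.pubsub()
pubsub.subscribe("chat")
for message in pubsub.listen():
    if message["type"] == "message":
        print(message["data"])

# Publisher, from another process:
#   redis.Redis().publish("chat", "hello")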
@bcambel
bcambel / node-reader.js
Created April 4, 2012 18:49 — forked from dongyuwei/node-reader.js
google reader api for nodejs
/*
This library was developed by Will Honey.
It is licensed under the GPLv3 Open Source License
This library requires the underscore library found at http://documentcloud.github.com/underscore/
This library requires the underscore string library found at http://edtsech.github.com/underscore.string/
This library requires the support of localStorage. Updates could be easily made to change that.
*/
/* jslint adsafe: false, devel: true, regexp: true, browser: true, vars: true, nomen: true, maxerr: 50, indent: 4 */
@bcambel
bcambel / gist:2622001
Created May 6, 2012 12:21 — forked from osiloke/gist:1138798
Generating URLs to crawl from outside a Scrapy spider
from scrapy import log
from scrapy.item import Item
from scrapy.http import Request
from scrapy.contrib.spiders import XMLFeedSpider
def NextURL():
"""
Generate a list of URLs to crawl. You can query a database or come up with some other means
Note that if you generate URLs to crawl from a scraped URL then you're better off using a
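The docstring is cut off in this preview, but the technique it describes is feeding a spider URLs that are generated outside its own callbacks. A minimal sketch using the plain scrapy.Spider API (the original gist builds on XMLFeedSpider); next_urls() and the example URLs are hypothetical stand-ins for a database query:

import scrapy

def next_urls():
    # Hypothetical stand-in for querying a database or another external source.
    return ["http://example.com/page/%d" % i for i in range(1, 4)]

class ExternalUrlSpider(scrapy.Spider):
    name = "external_urls"

    def start_requests(self):
        # Seed the crawl from outside the spider's parse() chain.
        for url in next_urls():
            yield scrapy.Request(url, callback=self.parse)

    def parse(self, response):
        yield {"url": response.url, "title": response.css("title::text").get()}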
@bcambel
bcambel / sql_mongodb_dump_middleware.py
Created May 21, 2012 09:12 — forked from kesor/sql_mongodb_dump_middleware.py
Django MongoDB + SQL dump middleware
from django.core.exceptions import MiddlewareNotUsed
from django.conf import settings
from django.db import connection
from pymongo.connection import Connection
from time import time
import struct
import bson
from bson.errors import InvalidBSON
class SqldumpMiddleware(object):
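Only the imports and the class declaration make it into this preview. A minimal sketch of the idea in the title, copying the SQL queries Django recorded for a request into a MongoDB collection; the modern middleware style, the MongoClient usage, and the profiling / sql_queries names are assumptions rather than the original gist's code (which imports the older pymongo Connection class):

from django.conf import settings
from django.core.exceptions import MiddlewareNotUsed
from django.db import connection
from pymongo import MongoClient

class QueryDumpMiddleware(object):
    def __init__(self, get_response):
        if not settings.DEBUG:
            # connection.queries is only populated when DEBUG is True.
            raise MiddlewareNotUsed()
        self.get_response = get_response
        self.collection = MongoClient()["profiling"]["sql_queries"]  # assumed names

    def __call__(self, request):
        response = self.get_response(request)
        if connection.queries:
            self.collection.insert_one({
                "path": request.path,
                "queries": list(connection.queries),
            })
        return response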
@bcambel
bcambel / gist:2844136
Created May 31, 2012 15:24
DDAGENTSRV_LOG
2012-05-31 15:17:40,719 - root - INFO - Logging to /tmp/dd-agent.log
2012-05-31 15:17:40,719 - root - WARNING - Pid file: /tmp/dd-agent.pid
2012-05-31 15:17:40,719 - root - INFO - Running in foreground
2012-05-31 15:17:40,719 - agent - DEBUG - Collecting basic system stats
2012-05-31 15:17:40,738 - agent - DEBUG - System: {'nixV': ('Ubuntu', '10.04', 'lucid'), 'cpuCores': 2, 'machine': 'x86_64', 'platform': 'linux2', 'pythonV': '2.6.5', 'processor': ''}
2012-05-31 15:17:40,738 - agent - DEBUG - Creating checks instance
2012-05-31 15:17:40,785 - agent - INFO - Running on EC2, instanceId: i-fdf2929e
2012-05-31 15:17:40,786 - checks - INFO - Dogstream parsers: []
2012-05-31 15:17:40,788 - checks - INFO - Starting checks
2012-05-31 15:17:40,788 - checks - DEBUG - SIZE: <function getApacheStatus at 0x2ca5a28> wrote 5 bytes uncompressed
@bcambel
bcambel / latency.txt
Created June 3, 2012 22:37 — forked from jboner/latency.txt
Latency numbers every programmer should know
L1 cache reference                            0.5 ns
Branch mispredict                               5 ns
L2 cache reference                              7 ns                 14x L1 cache
Mutex lock/unlock                              25 ns
Main memory reference                         100 ns                 20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy                3,000 ns
Send 1K bytes over 1 Gbps network          10,000 ns    0.01 ms
Read 4K randomly from SSD                 150,000 ns    0.15 ms
Read 1 MB sequentially from memory        250,000 ns    0.25 ms
Round trip within same datacenter         500,000 ns    0.5  ms
@bcambel
bcambel / index.html
Created June 8, 2012 23:42 — forked from ralphbean/index.html
d3 viz for narcissus
<!DOCTYPE html>
<html>
<head>
<title>Proof of concept for d3 viz + narcissus</title>
<link rel="stylesheet" type="text/css" href="spider.css" />
<script type="text/javascript" src="http://mbostock.github.com/d3/d3.min.js"></script>
<script type="text/javascript" src="http://mbostock.github.com/d3/d3.geom.min.js"></script>
<script type="text/javascript" src="http://mbostock.github.com/d3/d3.layout.min.js"></script>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
<script type="text/javascript" src="spider.js"></script>
@bcambel
bcambel / sample.config
Created July 31, 2012 21:28 — forked from sidupadhyay/sample.config
Sample HA Proxy Config for Tornado/Socket.io Backends
global
    maxconn 10000           # Total max connections. This is dependent on ulimit
    nbproc 2                # number of HAProxy worker processes

defaults
    mode http
    option redispatch       # re-dispatch sessions to another server on connection failure
    maxconn 2000
    contimeout 5000         # connect timeout to backend servers, in ms
    clitimeout 50000        # client-side inactivity timeout, in ms