- heka.counter-output: Generated by CounterFilter. Usually picked up by a LogOutput for writing to stdout.
- heka.plugin-report: Used inside the reporting infrastructure for plugins to provide report data to the dashboard, typically not injected into the router at all.
- heka.input-report: Used inside the reporting infrastructure for the input recycle chan to provide report data to the dashboard, typically not injected into the router at all.
- heka.inject-report: Used inside the reporting infrastructure for the inject recycle chan to provide report data to the dashboard, typically not injected into the router at all.
- heka.router-report: Used inside the reporting infrastructure for the router to provide report data to the dashboard, typically not injected into the router at all.
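The first entry above says the counter output is normally consumed by a LogOutput. As a minimal sketch of that wiring in Heka's TOML configuration (the encoder name assumes the stock Heka distribution, and exact keys vary by version):

[LogOutput]
message_matcher = "Type == 'heka.counter-output'"
encoder = "PayloadEncoder"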
#!/usr/bin/env bash
# Example:
#   ./find-ecr-image.sh foo/bar mytag
if [[ $# -lt 2 ]]; then
    echo "Usage: $( basename $0 ) <repository-name> <image-tag>"
    exit 1
fi
IMAGE_META="$( aws ecr describe-images --repository-name=$1 --image-ids=imageTag=$2 2> /dev/null )"
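The shell snippet is cut off after the describe-images call; presumably it goes on to check the command's exit status and report whether the tag exists. As a hedged sketch of the same existence check in Go using aws-sdk-go-v2 (the SDK choice, error handling, and output wording are assumptions, not part of the original script):

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/ecr"
	"github.com/aws/aws-sdk-go-v2/service/ecr/types"
)

func main() {
	if len(os.Args) < 3 {
		log.Fatalf("usage: %s <repository-name> <image-tag>", os.Args[0])
	}
	repo, tag := os.Args[1], os.Args[2]

	// Load credentials and region the same way the AWS CLI does.
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := ecr.NewFromConfig(cfg)

	// Ask ECR for the image with this tag; a missing tag comes back as an error.
	out, err := client.DescribeImages(context.TODO(), &ecr.DescribeImagesInput{
		RepositoryName: aws.String(repo),
		ImageIds:       []types.ImageIdentifier{{ImageTag: aws.String(tag)}},
	})
	if err != nil || len(out.ImageDetails) == 0 {
		fmt.Printf("%s:%s not found\n", repo, tag)
		os.Exit(1)
	}
	fmt.Printf("%s:%s found\n", repo, tag)
}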
package main

import (
	"context"
	"flag"
	"fmt"
	"log"
	"net/http"
	"os"
	"os/signal"
production:
  adapter: postgresql
  encoding: unicode
  sslmode: require
  url: postgres://user:password@host:port/db
# You don't need Fog in Ruby or some other library to upload to S3 -- shell works perfectly fine
# This is how I upload my new Sol Trader builds (http://soltrader.net)
# Based on a modified script from here: http://tmont.com/blargh/2014/1/uploading-to-s3-in-bash

S3KEY="my aws key"
S3SECRET="my aws secret" # pass these in

function putS3
{
    path=$1
require "string" | |
require "math" | |
require "table" | |
require "cjson" | |
local status_codes = {} | |
local request_times = {} | |
local ticker_interval = read_config("ticker_interval") or error("must provide ticker_interval") | |
local percent_thresh = read_config("percent_threshold") or 90 |
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "CoreOS on EC2: http://coreos.com/docs/running-coreos/cloud-providers/ec2/",
  "Mappings": {
    "RegionMap": {
      "ap-northeast-1": {
        "AMI": "ami-f9b08ff8"
      },
      "ap-southeast-1": {
        "AMI": "ami-c24f6c90"
require("circular_buffer") | |
require("string") | |
require("cjson") | |
local idx_num | |
local stat_value | |
local debug | |
-- # rows, # cols, # seconds per row | |
cbuf = circular_buffer.new(300, 1, 1) |
# A script to post back to Slack via the webhooks API

## Why this exists

Slack's own Hubot adapter needs the Hubot installation to be reachable over the web, which can be problematic in some setups and is a potential security risk. This hack lets you run your Hubot behind a firewall and connect to Slack via the IRC gateway. To respond, Hubot uses Slack's incoming webhooks endpoint.
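For reference, an incoming webhook is just an HTTP POST of a JSON payload to the webhook URL Slack gives you. A minimal Go sketch of that posting step (the environment variable name, channel, and username are placeholders; the actual gist wires this into Hubot rather than a standalone program):

package main

import (
	"bytes"
	"encoding/json"
	"log"
	"net/http"
	"os"
)

// payload mirrors the JSON fields Slack's incoming webhooks accept.
type payload struct {
	Channel  string `json:"channel,omitempty"`
	Username string `json:"username,omitempty"`
	Text     string `json:"text"`
}

func main() {
	// SLACK_WEBHOOK_URL is an assumed env var holding your incoming webhook URL.
	url := os.Getenv("SLACK_WEBHOOK_URL")

	body, err := json.Marshal(payload{
		Channel:  "#general",
		Username: "hubot",
		Text:     "Hello from behind the firewall",
	})
	if err != nil {
		log.Fatal(err)
	}

	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		log.Fatalf("webhook returned %s", resp.Status)
	}
}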
Producer

Setup

bin/kafka-topics.sh --zookeeper esv4-hcl197.grid.linkedin.com:2181 --create --topic test-rep-one --partitions 6 --replication-factor 1
bin/kafka-topics.sh --zookeeper esv4-hcl197.grid.linkedin.com:2181 --create --topic test --partitions 6 --replication-factor 3

Single thread, no replication

bin/kafka-run-class.sh org.apache.kafka.clients.tools.ProducerPerformance test7 50000000 100 -1 acks=1 bootstrap.servers=esv4-hcl198.grid.linkedin.com:9092 buffer.memory=67108864 batch.size=8196