module.exports = {
  format: 'json',
  apiVersion: '2013-11-04',
  endpointPrefix: 'kinesis',
  jsonVersion: '1.1',
  serviceFullName: 'Amazon Kinesis',
  signatureVersion: 'v4',
  targetPrefix: 'Kinesis_20131104',
  timestampFormat: 'iso8601',
  operations: {
@chrishamant
chrishamant / gist:4680293
Last active December 11, 2015 23:58
a sample of how you could conceivably conditionally load something with require.js
require(['base'], function(base){
  if (someModernizrOrOtherFeatureTest()) {
    require(['someSpecificFeature'], function(foo){
      // do something with foo once someModernizrOrOtherFeatureTest passes
    });
  }
});
var system = require('system');
var casper = require('casper').create();
if (system.args.length <= 4) {
  console.log('I need a URL!');
  casper.run();
} else {
  var screens = [
    {w:1280,h:768}, {w:800,h:600}, {w:1600,h:1200}, {w:1024,h:768},
    {w:480,h:320}, {w:320,h:480}, {w:1920,h:1080}
  ];
  var url = system.args[4];
@chrishamant
chrishamant / less.less
Created June 15, 2012 21:47 — forked from paulmillr/less.less
Sass vs Stylus vs LESS
.border-radius (@radius) {
  -webkit-border-radius: @radius;
  -o-border-radius: @radius;
  -moz-border-radius: @radius;
  -ms-border-radius: @radius;
  border-radius: @radius;
}
.user-list {
  // need to use special `.` syntax
@chrishamant
chrishamant / nodejs.sh
Created February 1, 2012 01:17 — forked from crcastle/nodejs.sh
Node.js startup script for AWS EC2 Linux box
#!/bin/bash
# nodejs - Startup script for node.js server
# chkconfig: 35 85 15
# description: node is an event-based web server.
# processname: node
# server: /path/to/your/node/file.js
# pidfile: /var/run/nodejs.pid
#
#!/usr/bin/env ruby1.9.1
#
# Testing multipart uploads into s3
# Very basic script for testing how the functionality works
#
# Takes a file, splits it up
# For each part get the base64 encoded md5 of the part
# Then run through the parts and upload them
# Refs:
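The steps the Ruby script describes (split a file into parts, base64-encode each part's MD5, then upload the parts) can be sketched in Python with the standard library alone. This is an illustrative sketch, not the gist's actual code: the 5 MiB part size mirrors S3's minimum for non-final parts, and the `Content-MD5` upload call it mentions is left out.

```python
import base64
import hashlib
import io

PART_SIZE = 5 * 1024 * 1024  # 5 MiB, the S3 minimum size for non-final parts


def iter_parts(fp, part_size=PART_SIZE):
    """Yield (part_number, data, base64_md5) for each chunk of the file."""
    part_number = 1
    while True:
        data = fp.read(part_size)
        if not data:
            break
        # S3 expects the base64-encoded binary MD5 digest as Content-MD5
        md5_b64 = base64.b64encode(hashlib.md5(data).digest()).decode('ascii')
        yield part_number, data, md5_b64
        part_number += 1


# Demo on an in-memory "file"; a real script would open(path, 'rb')
# and send each part's md5_b64 as the Content-MD5 header of the upload.
fake_file = io.BytesIO(b'x' * (PART_SIZE + 100))
parts = list(iter_parts(fake_file))
# two parts: one full-size, one 100-byte remainder
```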
@chrishamant
chrishamant / s3_multipart_upload.py
Created January 3, 2012 19:29
Example of Parallelized Multipart upload using boto
#!/usr/bin/env python
"""Split large file into multiple pieces for upload to S3.
S3 only supports 5Gb files for uploading directly, so for larger CloudBioLinux
box images we need to use boto's multipart file support.
This parallelizes the task over available cores using multiprocessing.
Usage:
s3_multipart_upload.py <file_to_transfer> <bucket_name> [<s3_key_name>]
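The parallelization the docstring describes can be sketched as: compute the byte (offset, length) of each part, then map a worker over those ranges. This sketch uses a thread pool for portability where the original script uses `multiprocessing` across cores, and `upload_part` is a placeholder worker that only reads its slice; the real script calls boto's multipart upload methods (e.g. `upload_part_from_file`) inside it.

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor


def part_ranges(total_size, part_size):
    """Byte (offset, length) pairs covering the whole file, in order."""
    return [(off, min(part_size, total_size - off))
            for off in range(0, total_size, part_size)]


def upload_part(path, part_num, offset, length):
    # Placeholder worker: the real script would call boto's
    # mp.upload_part_from_file(fp, part_num) here instead of just reading.
    with open(path, 'rb') as fp:
        fp.seek(offset)
        data = fp.read(length)
    return part_num, len(data)


# Demo on a small temp file with a 1 KiB "part size".
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'a' * 2500)
path = tmp.name
ranges = part_ranges(os.path.getsize(path), 1024)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(
        lambda args: upload_part(path, *args),
        [(i + 1, off, ln) for i, (off, ln) in enumerate(ranges)]))
os.unlink(path)
```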
@chrishamant
chrishamant / boto_mpupload_1
Created January 3, 2012 19:25 — forked from garnaat/boto_mpupload_1
An IPython transcript showing use of S3 MultiPart Upload in boto
In [1]: import boto
In [2]: c = boto.connect_s3()
In [3]: b = c.lookup('test-1245812163')
In [4]: mp = b.initiate_multipart_upload('testmpupload2')
In [5]: mp
Out[5]: <MultiPartUpload testmpupload2>
@chrishamant
chrishamant / tomcat.sh
Created November 11, 2011 14:01 — forked from valotas/tomcat.sh
Tomcat init.d script
#!/bin/bash
#
# tomcat7 This shell script takes care of starting and stopping Tomcat
#
# chkconfig: - 80 20
#
### BEGIN INIT INFO
# Provides: tomcat7
# Required-Start: $network $syslog
# Required-Stop: $network $syslog
command="/usr/bin/gitserve richard rw",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAA...zzz [email protected]