@innermond
innermond / s3up
Created February 9, 2016 08:46
Minify, compress, and upload to an S3 bucket. Depends on awscli
#!/bin/bash
# Minify a source file and stage it for upload to an S3 bucket (requires awscli).
root=~/Projects/printuridigital.ro
src=$root/src
dist=$root/dist
what=$1
options_s3=$2
if [ -z "$1" ]; then echo "need file argument"; exit 1; fi
# Derive the asset type from the file extension (e.g. css, js).
type=${1##*.}
file_src=$src/$type/$what
file_min=$dist/$type/$what
{
  "type": "record",
  "name": "amazon-log",
  "namespace": "com.streambright.avro",
  "fields": [{
    "name": "bucket_owner",
    "type": "string",
    "doc": "The canonical user ID of the owner of the source bucket."
  }, {
    "name": "bucket",
@kwilczynski
kwilczynski / disable-ipv6.sh
Last active December 22, 2025 00:41
Amazon Linux OS tweaks
#!/bin/bash
# Abort on unset variables, command failures, and failures anywhere in a pipeline.
set -u
set -e
set -o pipefail
export PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# Blacklist the ipv6 kernel module so it is never loaded at boot.
cat <<'EOF' > /etc/modprobe.d/blacklist-ipv6.conf
@mlapida
mlapida / EC2-Tag-Assets-Lambda.py
Last active January 17, 2024 08:10
A lambda function that will copy EC2 tags to all related Volumes and Network Interfaces. A full writeup can be found on my site https://empty.coffee/tagging-and-snapshotting-with-lambda/ - Thank you to the community for keeping this updated!
from __future__ import print_function
import json
import boto3
import logging
# set up simple logging (errors only)
logger = logging.getLogger()
logger.setLevel(logging.ERROR)
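The excerpt ends before the tagging logic. A minimal sketch of the approach the description names, assuming instance tags should be mirrored onto attached volumes and network interfaces (the function name is illustrative, not from the gist):

def copy_tags_to_attachments(instance):
    # Keep only user tags; keys prefixed with 'aws:' are reserved by AWS.
    tags = [t for t in instance.tags or [] if not t['Key'].startswith('aws:')]
    if not tags:
        return
    # boto3 exposes attached volumes and ENIs directly on the Instance resource.
    for volume in instance.volumes.all():
        volume.create_tags(Tags=tags)
    for interface in instance.network_interfaces:
        interface.create_tags(Tags=tags)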
@mlapida
mlapida / CloudWatchLogsKinesisFirehose-Lambda.py
Last active October 3, 2019 01:01
A short Lambda function that can be sent CloudWatch Logs (in this case Flow Logs) and forwards them to Kinesis Firehose for storage in S3. A full writeup can be found on my site http://mlapida.com/thoughts/exporting-cloudwatch-logs-to-s3-lambda
import boto3
import logging
import json
import gzip
from StringIO import StringIO  # Python 2; on Python 3 use io.BytesIO instead
logger = logging.getLogger()
logger.setLevel(logging.INFO)
client = boto3.client('firehose')
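The excerpt stops before the handler. A minimal Python 3 sketch of the flow the description outlines (the delivery stream name is hypothetical, and batches over 500 records would need chunking, which is omitted here):

import base64
import gzip
import json
import boto3

firehose = boto3.client('firehose')

def handler(event, context):
    # CloudWatch Logs subscriptions deliver a base64-encoded, gzipped payload.
    payload = gzip.decompress(base64.b64decode(event['awslogs']['data']))
    log_data = json.loads(payload)
    records = [{'Data': (e['message'] + '\n').encode()}
               for e in log_data['logEvents']]
    firehose.put_record_batch(
        DeliveryStreamName='flowlogs-to-s3',  # hypothetical stream name
        Records=records)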
@mlapida
mlapida / EBS-Orphaned-Report-Lambda.py
Last active April 29, 2022 08:10
Generate a report of orphaned EBS volumes and send an SNS notification. A full writeup can be found on my site http://mlapida.com/thoughts/lambda-tracking-orphaned-ebs-volumes
import boto3
import logging
from datetime import datetime, timedelta  # explicit imports instead of a wildcard
# set up simple logging (warnings and above)
logger = logging.getLogger()
logger.setLevel(logging.WARNING)
# define the connection
ec2 = boto3.resource('ec2', region_name="us-west-2")
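The excerpt cuts off here. A minimal sketch of the report the description promises, assuming "orphaned" means a volume in the 'available' state, i.e. attached to nothing (the topic ARN is hypothetical):

sns = boto3.client('sns', region_name='us-west-2')

def handler(event, context):
    # Volumes in the 'available' state are not attached to any instance.
    orphans = ec2.volumes.filter(
        Filters=[{'Name': 'status', 'Values': ['available']}])
    lines = ['%s  %d GiB  created %s' % (v.id, v.size, v.create_time)
             for v in orphans]
    if lines:
        sns.publish(
            TopicArn='arn:aws:sns:us-west-2:123456789012:ebs-report',  # hypothetical
            Subject='Orphaned EBS volumes',
            Message='\n'.join(lines))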
@mlapida
mlapida / EC2-Stopped-Tagged-Lambda.py
Last active May 29, 2024 06:00
Using a lambda function, stop all instances that are tagged appropriately.
import boto3
import logging
# set up simple logging for INFO
logger = logging.getLogger()
logger.setLevel(logging.INFO)
#define the connection
ec2 = boto3.resource('ec2')
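The rest of the function is not shown. A minimal sketch of the stop logic, assuming instances are selected by a tag (the tag name AutoStop is illustrative, not from the gist):

def handler(event, context):
    # Find running instances carrying the shutdown tag.
    tagged = ec2.instances.filter(Filters=[
        {'Name': 'tag:AutoStop', 'Values': ['true']},
        {'Name': 'instance-state-name', 'Values': ['running']}])
    ids = [i.id for i in tagged]
    if ids:
        logger.info('stopping %s' % ids)
        ec2.instances.filter(InstanceIds=ids).stop()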
@mlapida
mlapida / EC2-Snapshot-Lambda.py
Last active April 29, 2022 08:10
A lambda function for taking a snapshot of all EC2 instances in a region and cleaning the snapshots up after a set number of days. This one now works best in conjunction with Asset Tagging https://gist.github.com/mlapida/931c03cce1e9e43f147b A full write up can be found on my site https://empty.coffee/tagging-and-snapshotting-with-lambda/
import boto3
import logging
import datetime
import re
import time
# set up simple logging (errors only)
logger = logging.getLogger()
logger.setLevel(logging.ERROR)
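The excerpt stops here. A minimal sketch of the snapshot-and-expire cycle the description outlines, assuming a DeleteOn tag drives the cleanup (the tag name and retention window are illustrative):

RETENTION_DAYS = 7  # assumed retention window
ec2_client = boto3.client('ec2')

def handler(event, context):
    # Snapshot every EBS volume, stamping each snapshot with a deletion date.
    delete_on = (datetime.date.today() +
                 datetime.timedelta(days=RETENTION_DAYS)).isoformat()
    for reservation in ec2_client.describe_instances()['Reservations']:
        for instance in reservation['Instances']:
            for mapping in instance.get('BlockDeviceMappings', []):
                if 'Ebs' not in mapping:
                    continue
                snap = ec2_client.create_snapshot(
                    VolumeId=mapping['Ebs']['VolumeId'],
                    Description='lambda-backup')
                ec2_client.create_tags(
                    Resources=[snap['SnapshotId']],
                    Tags=[{'Key': 'DeleteOn', 'Value': delete_on}])
    # Delete snapshots whose DeleteOn date has passed.
    today = datetime.date.today().isoformat()
    snaps = ec2_client.describe_snapshots(
        OwnerIds=['self'],
        Filters=[{'Name': 'tag-key', 'Values': ['DeleteOn']}])['Snapshots']
    for s in snaps:
        tags = {t['Key']: t['Value'] for t in s.get('Tags', [])}
        if tags['DeleteOn'] <= today:
            ec2_client.delete_snapshot(SnapshotId=s['SnapshotId'])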
@jasonrdsouza
jasonrdsouza / combineS3Files.py
Last active June 3, 2023 17:22
Python script to efficiently concatenate S3 files
'''
This script performs efficient concatenation of files stored in S3. Given a
folder, output location, and optional suffix, all files with the given suffix
will be concatenated into one file stored in the output location.
Concatenation is performed within S3 when possible, falling back to local
operations when necessary.
Run `python combineS3Files.py -h` for more info.
'''
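The within-S3 path the docstring mentions relies on multipart upload-part-copy, which lets S3 assemble the output object server side; a minimal sketch of that technique (note every part except the last must be at least 5 MB, which is why the script needs a local fallback for small files):

import boto3

s3 = boto3.client('s3')

def concat_in_s3(bucket, keys, dest_key):
    # Assemble dest_key from the given source keys without downloading them.
    upload = s3.create_multipart_upload(Bucket=bucket, Key=dest_key)
    parts = []
    for number, key in enumerate(keys, start=1):
        part = s3.upload_part_copy(
            Bucket=bucket, Key=dest_key,
            UploadId=upload['UploadId'], PartNumber=number,
            CopySource={'Bucket': bucket, 'Key': key})
        parts.append({'PartNumber': number,
                      'ETag': part['CopyPartResult']['ETag']})
    s3.complete_multipart_upload(
        Bucket=bucket, Key=dest_key, UploadId=upload['UploadId'],
        MultipartUpload={'Parts': parts})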
@peterwells
peterwells / aws.rb
Last active August 17, 2023 18:11
This Gist shows how I was able to implement AWS assume role functionality within my application using the Ruby AWS SDK, Fog, and Carrierwave. Currently Fog has no concept of one role assuming another. They do, however, have an existing mechanism for re-negotiating soon-to-expire credentials. By monkey-patching this function, we can leverage the …
# resides in app/models/connectors/
class Connectors::Aws
  attr_reader :aws_access_key_id, :aws_secret_access_key, :aws_security_token, :expires_at

  def initialize
    if ENV.has_key?('AWS_SECURITY_TOKEN') # localhost: take credentials straight from the environment
      @aws_access_key_id = ENV['AWS_ACCESS_KEY_ID']
      @aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
      @aws_security_token = ENV['AWS_SECURITY_TOKEN']
      @expires_at = Time.now + 10.hours
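For comparison, the credential handoff the Ruby class wraps is a single STS AssumeRole call; a minimal boto3 sketch of the same negotiation (the role ARN and session name are hypothetical):

import boto3

def assume_role_credentials(role_arn='arn:aws:iam::123456789012:role/app-role'):  # hypothetical ARN
    # STS returns temporary keys plus an expiry, the same four fields the
    # Ruby connector exposes for Fog to renegotiate against.
    sts = boto3.client('sts')
    resp = sts.assume_role(RoleArn=role_arn, RoleSessionName='app-session')
    creds = resp['Credentials']
    return (creds['AccessKeyId'], creds['SecretAccessKey'],
            creds['SessionToken'], creds['Expiration'])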