Version: 1.9.8
Platform: x86_64
First, update the package lists and install the required system packages:
sudo apt-get update
sudo apt-get install build-essential chrpath libssl-dev libxft-dev
user www-data;
worker_processes 1;
pid /var/run/nginx.pid;

events {
    worker_connections 1024;
}

http {
    sendfile on;
}
""" | |
=========== | |
Description | |
=========== | |
Simple script to copy and gzip static web files to an AWS S3 bucket. S3 is great for cheap hosting of static web content, but by default it does not gzip CSS and JavaScript, which results in much larger data transfer and longer load times for many applications | |
When using this script CSS and JavaScript files are gzipped in transition, and appropriate headers set as per the technique described here: http://www.jamiebegin.com/serving-compressed-gzipped-static-files-from-amazon-s3-or-cloudfront/ | |
* Files overwrite old versions |
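A minimal sketch of the upload step described above, using boto; the bucket name, local path, key name, and content type shown are placeholders, not values from the original script. The file is gzipped into an in-memory buffer and stored with Content-Type and Content-Encoding metadata so browsers decompress it transparently.

import gzip
import io

import boto


def upload_gzipped(bucket_name, local_path, key_name, content_type):
    """Gzip a local file and upload it to S3 with the appropriate headers."""
    conn = boto.connect_s3()
    bucket = conn.get_bucket(bucket_name)

    # Compress the file contents into an in-memory buffer.
    buf = io.BytesIO()
    with open(local_path, 'rb') as src:
        with gzip.GzipFile(fileobj=buf, mode='wb') as gz:
            gz.write(src.read())

    # Store the compressed bytes with Content-Encoding: gzip so the browser
    # knows to decompress the response.
    key = bucket.new_key(key_name)
    key.set_metadata('Content-Type', content_type)
    key.set_metadata('Content-Encoding', 'gzip')
    key.set_contents_from_string(buf.getvalue(), policy='public-read')


# Example call (placeholder names):
# upload_gzipped('example-bucket', 'static/site.css', 'static/site.css', 'text/css')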
{% extends 'layouts/backend.html' %}
{% load i18n %}

{% block body %}
<div class="row">
  <div class="col-md-6">
    <form id="omni-form" action="{% url 'post_list' %}" method="GET" class="form-inline" role="form">
      <div class="input-group">
        <label class="sr-only" for="omnisearch">{% trans 'Search for posts' %}</label>
        <input type="text" name="q" class="form-control" id="omnisearch">
      </div>
    </form>
  </div>
</div>
{% endblock %}
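The form submits a q parameter via GET to the post_list URL. A sketch of the kind of view that could sit behind it follows; the Post model, its title and body fields, and the template path are assumptions for illustration, not part of the original template.

from django.db.models import Q
from django.shortcuts import render

from .models import Post  # assumed model


def post_list(request):
    posts = Post.objects.all()
    query = request.GET.get('q')
    if query:
        # Match the search term against a couple of assumed text fields.
        posts = posts.filter(Q(title__icontains=query) | Q(body__icontains=query))
    return render(request, 'posts/post_list.html', {'posts': posts})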
# An example of how to use AWS SNS with Python's boto
# By Stuart Myles @smyles
# http://aws.amazon.com/sns/
# https://github.com/boto/boto
#
# Inspired by parts of the Ruby SWF SNS tutorial http://docs.aws.amazon.com/amazonswf/latest/developerguide/swf-sns-tutorial-implementing-activities-poller.html
# And the Python SNS code in http://blog.coredumped.org/2010/04/amazon-announces-simple-notification.html and http://awsadvent.tumblr.com/post/37531769345/simple-notification-service-sns
import boto.sns as sns
import json
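A minimal sketch of publishing a notification with these imports; the region, topic name, and message payload are placeholders, not taken from the original example.

# Connect to SNS in a region (placeholder region).
conn = sns.connect_to_region('us-east-1')

# create_topic is idempotent: it returns the topic's ARN, creating the topic
# only if it does not already exist.
response = conn.create_topic('example-topic')
topic_arn = (response['CreateTopicResponse']
                     ['CreateTopicResult']
                     ['TopicArn'])

# Publish an example JSON payload as the message body.
conn.publish(topic=topic_arn,
             subject='Example notification',
             message=json.dumps({'event': 'example', 'detail': 'Hello from boto and SNS'}))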
import os

import pytest
from alembic.command import upgrade
from alembic.config import Config

from project.factory import create_app
from project.database import db as _db
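These imports are typically wired together in a conftest.py that builds the app and migrates a test database once per session. A sketch under those assumptions follows; it assumes create_app returns a Flask app, _db is a Flask-SQLAlchemy instance, the factory accepts a 'testing' config name, and alembic.ini sits at the project root. None of those details come from the original file.

@pytest.fixture(scope='session')
def app(request):
    app = create_app('testing')  # assumed factory argument
    ctx = app.app_context()
    ctx.push()
    request.addfinalizer(ctx.pop)
    return app


@pytest.fixture(scope='session')
def db(app, request):
    _db.app = app
    # Build the schema by running the alembic migrations rather than
    # create_all(), so tests exercise the same schema as production.
    config = Config('alembic.ini')  # assumed location
    config.set_main_option('sqlalchemy.url', app.config['SQLALCHEMY_DATABASE_URI'])
    upgrade(config, 'head')
    request.addfinalizer(_db.drop_all)
    return _db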
""" | |
py.test demo | |
~~~~~~~~~~~~ | |
#yolo | |
""" | |
from pytest import raises, mark, fixture | |
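A few lines showing the three imported helpers in action; the divide function is a made-up example, not part of the original demo.

def divide(a, b):
    return a / b


@fixture
def numbers():
    return 10, 2


def test_divide_fixture(numbers):
    assert divide(*numbers) == 5


@mark.parametrize('a, b, expected', [(6, 3, 2), (9, 3, 3)])
def test_divide_parametrized(a, b, expected):
    assert divide(a, b) == expected


def test_divide_by_zero():
    with raises(ZeroDivisionError):
        divide(1, 0)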
#!/usr/bin/env python
#
# Very basic example of using Python and IMAP to iterate over emails in a
# gmail folder/label. This code is released into the public domain.
#
# RKI July 2013
# http://www.voidynullness.net/blog/2013/07/25/gmail-email-with-python-via-imap/
#
import sys
import imaplib
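A sketch of the basic loop the header describes: log in over SSL, select a folder, and iterate over every message, printing its subject. The account name and folder below are placeholders.

import email
import getpass

EMAIL_ACCOUNT = '[email protected]'  # placeholder
EMAIL_FOLDER = 'INBOX'                 # placeholder; any Gmail label works

mail = imaplib.IMAP4_SSL('imap.gmail.com')
mail.login(EMAIL_ACCOUNT, getpass.getpass())
mail.select(EMAIL_FOLDER)

rv, data = mail.search(None, 'ALL')
if rv != 'OK':
    sys.exit('No messages found')

for num in data[0].split():
    rv, msg_data = mail.fetch(num, '(RFC822)')
    if rv != 'OK':
        continue
    # Python 2: fetch returns str; on Python 3 use email.message_from_bytes.
    msg = email.message_from_string(msg_data[0][1])
    print('%s: %s' % (num, msg['Subject']))

mail.close()
mail.logout()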
This is a mix of two sources.
The first resource is great, but it didn't work for me as written: I had to remove the trailing "/*" from the resource string in the bucket policy to make it work. I also noticed that setting the policy on the source bucket alone was sufficient. In the end, these are the exact steps I followed to copy data between two buckets owned by two different accounts.
Basically, the idea is: attach a bucket policy to the source bucket that grants the destination account read access (s3:ListBucket and s3:GetObject), then run the copy using the destination account's credentials so the copied objects end up owned by the destination account.
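One way to do the copy itself from Python with boto is sketched below; the bucket names are placeholders, and it assumes the source bucket's policy already grants this account s3:ListBucket and s3:GetObject as described above. This is an illustration of the idea, not the original author's exact steps.

import boto

# Connect with the *destination* account's credentials.
conn = boto.connect_s3()

# validate=False skips the ownership check, since the source bucket belongs
# to the other account.
src = conn.get_bucket('source-bucket-name', validate=False)
dst = conn.get_bucket('destination-bucket-name')

for key in src.list():
    # copy_key performs a server-side copy; nothing is downloaded locally.
    dst.copy_key(key.name, src.name, key.name)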
# coding: utf-8
from fabric.api import env, cd, run, sudo
from fabric.contrib.files import exists
from fabric.colors import green

from config import (HOST, USER, PASSWORD, APP_DIR,
                    ENV_HOME, PROJECT_NAME, VIRTUALENVS, REPOSITORY, BRANCH)

env.hosts = [HOST]
env.user = USER
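A sketch of the kind of deploy task these imports support; the virtualenv path, git commands, requirements file, and nginx restart are assumptions about the rest of the fabfile, not taken from the original.

env.password = PASSWORD


def deploy():
    """Pull the latest code, update dependencies, and restart the web server."""
    print(green('Deploying %s to %s' % (PROJECT_NAME, HOST)))
    with cd(APP_DIR):
        if not exists(PROJECT_NAME):
            run('git clone %s %s' % (REPOSITORY, PROJECT_NAME))
    with cd('%s/%s' % (APP_DIR, PROJECT_NAME)):
        run('git pull origin %s' % BRANCH)
        # Install dependencies into the project's assumed virtualenv.
        run('%s/%s/bin/pip install -r requirements.txt' % (VIRTUALENVS, PROJECT_NAME))
        sudo('service nginx restart')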