Attention: the list was moved to
https://github.com/dypsilon/frontend-dev-bookmarks
This page is no longer maintained; please update your bookmarks.
# Makefile for transpiling with Babel in a Node app, or in a client- or
# server-side shared library.

.PHONY: all clean

# Install `babel-cli` in a project to get the transpiler.
babel := node_modules/.bin/babel

# Identify modules to be transpiled by recursively searching the `src/`
# directory.
# -*- coding: utf-8 -*-
import asyncio

import uvloop
from aiohttp.web import Application, MsgType, WebSocketResponse


# app['connections'] maps a user id to that user's open websocket(s).
def add_socket(app, socket, user_id):
    if user_id in app['connections']:
        pass
from django.core.management.base import BaseCommand, CommandError
from django.db.models import get_models, get_app, fields
from django.db.models.fields import related


class Command(BaseCommand):
    help = """Generate factory-boy factories for the given app"""

    def handle(self, *args, **options):
        assert len(args) == 1, 'Must specify app name as first and only argument'
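The command is cut off after the argument check. A hypothetical sketch of how such a generator could continue, using the old-style `get_app`/`get_models` API from the imports above; the emitted factory template (modern `class Meta` style) is an assumption, not the gist's actual output.

# Hypothetical continuation sketch, not the gist's actual code: resolve the
# app by label and print a bare factory-boy stub for each of its models.
from django.db.models import get_app, get_models


def print_factories(app_label, stdout):
    app = get_app(app_label)
    for model in get_models(app):
        stdout.write(
            'class {name}Factory(factory.django.DjangoModelFactory):\n'
            '    class Meta:\n'
            '        model = {name}\n\n'.format(name=model.__name__)
        )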
from django.conf import settings


def email_template(
        template_name=None,
        template_context=None,
        subject='',
        recipients=None,
        sender=settings.DEFAULT_FROM_EMAIL,
        fail_silently=False,
        use_markdown=False,
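The signature above is cut off before its body. As a rough sketch of what a helper with those parameters typically does (an assumption, not the gist's actual implementation; the markdown option is omitted here):

# Illustrative sketch only; render_to_string and send_mail are real Django
# APIs, but this body is an assumption about what the original helper does.
from django.conf import settings
from django.core.mail import send_mail
from django.template.loader import render_to_string


def send_templated_email(template_name, template_context=None, subject='',
                         recipients=None, sender=settings.DEFAULT_FROM_EMAIL,
                         fail_silently=False):
    # Render the named template to a plain-text body and send it.
    body = render_to_string(template_name, template_context or {})
    send_mail(subject, body, sender, recipients or [], fail_silently=fail_silently)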
(This gist is pretty old; I've written up my current approach to the Pyramid integration in this blog post, but that post doesn't cover transaction management, so you may still find this useful.)
I've created a Pyramid scaffold which integrates Alembic, a migration tool, with the standard SQLAlchemy scaffold. (It also configures the Mako template system, because I prefer Mako.)
I am also using PostgreSQL for my database. PostgreSQL supports nested transactions. This means I can set up the tables at the beginning of the test session, then start a transaction before each test and roll it back afterwards; in turn, this means my tests operate in the same environment I expect to use in production, but they are also fast.
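To make that concrete, here is a minimal sketch of the transaction-per-test pattern described above. The `myapp.models` import, the test database URL, and the pytest-style fixtures are all assumed names for illustration, not the scaffold's actual test setup.

# Minimal sketch of the transaction-per-test pattern described above.
import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from myapp.models import Base  # hypothetical import: the app's declarative base

engine = create_engine('postgresql:///myapp_test')  # assumed test database
Session = sessionmaker()


@pytest.fixture(scope='session')
def tables():
    # Build the schema once for the whole test session, drop it at the end.
    Base.metadata.create_all(engine)
    yield
    Base.metadata.drop_all(engine)


@pytest.fixture
def db_session(tables):
    # Wrap each test in a transaction that is rolled back afterwards, so the
    # database stays clean while the tests stay fast.
    connection = engine.connect()
    transaction = connection.begin()
    session = Session(bind=connection)
    yield session
    session.close()
    transaction.rollback()
    connection.close()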
I based my approach on [sontek's blog post](http://sontek.net/blog/
#!/usr/bin/env python
"""
This script helps migrate issues from Bitbucket to GitHub.

It currently ignores milestones completely and doesn't care whether an issue
is open, new or on hold. As long as it's not closed, it's considered open.

To use it, install python-bitbucket, PyGithub and ipdb.
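For the GitHub half of that workflow, a hypothetical sketch with PyGithub might look like the following. The repository name, token handling, and the shape of the Bitbucket issue dicts are assumptions, not the script's actual code.

# Hypothetical sketch, not the script's actual code: recreate already-fetched
# Bitbucket issues (as plain dicts) on GitHub via PyGithub.
from github import Github


def copy_open_issues(bitbucket_issues, github_token, repo_name):
    repo = Github(github_token).get_repo(repo_name)
    for issue in bitbucket_issues:
        # As described above, anything that is not closed counts as open.
        if issue.get('status') == 'closed':
            continue
        repo.create_issue(
            title=issue['title'],
            body=issue.get('content', ''),
        )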
from celery.app import task  # assumed import; the original snippet does not show it


class FrealCountdownTask(task.Task):
    abstract = True

    @classmethod
    def apply_async(self, args=None, kwargs=None,
                    task_id=None, producer=None, connection=None, router=None,
                    link=None, link_error=None, publisher=None, add_to_parent=True,
                    **options):
        try:
""" | |
'22': | |
ufw.allow: | |
- enabled: true | |
- proto: tcp | |
- from: 127.0.0.1 | |
- to: any | |
""" |
# Non magic version, client is only used to append tokens.
# All other actions are explicit.
import requests
from requests.auth import AuthBase
from oauthlib.oauth2.draft25 import WebApplicationClient
from oauthlib.common import urldecode


# Very basic auth, only used to append tokens to requests
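The snippet cuts off before the class itself. A minimal sketch of what such an AuthBase subclass could look like, assuming the WebApplicationClient already holds an access token; this is an illustration, not the gist's actual class.

# Hypothetical sketch, not the original gist's class: a requests AuthBase that
# asks the oauthlib client to append its access token to each outgoing request.
class OAuth2TokenAuth(AuthBase):
    def __init__(self, client):
        self.client = client  # a WebApplicationClient that already holds a token

    def __call__(self, request):
        # oauthlib returns the (possibly modified) url, headers and body with
        # the token applied, e.g. as an Authorization header.
        request.url, request.headers, request.body = self.client.add_token(
            request.url,
            http_method=request.method,
            body=request.body,
            headers=request.headers,
        )
        return request

Usage would then be explicit, e.g. requests.get(url, auth=OAuth2TokenAuth(client)) once the client has obtained a token.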