Gists from duyet's GitHub profile
@darktable
darktable / app.yaml
Created March 16, 2011 19:10
GAE: app.yaml designed for serving a static site on Google App Engine (Python). Copy your static HTML and files into a folder called "static" next to app.yaml. Contains a bunch of mimetype declarations from html5boilerplate's .htaccess. May not be necessary.
application: you-app-name-here
version: 1
runtime: python
api_version: 1

default_expiration: "30d"

handlers:
- url: /(.*\.(appcache|manifest))
  mime_type: text/cache-manifest
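
The preview cuts off after the first handler. As a sketch (not necessarily the gist's exact rules), the catch-all handlers that serve the "static" folder mentioned in the description would look like this:

# sketch: serve index.html at the root and everything else out of static/
- url: /
  static_files: static/index.html
  upload: static/index.html

- url: /(.*)
  static_files: static/\1
  upload: static/(.*)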
@coolaj86
coolaj86 / how-to-publish-to-npm.md
Last active October 29, 2024 21:43
How to publish packages to NPM

Getting Started with NPM (as a developer)

As easy as 1, 2, 3!

Updated:

  • Aug 08, 2022 update config docs for npm 8+
  • Jul 27, 2021 add private scopes
  • Jul 22, 2021 add dist tags
  • Jun 20, 2021 update for --access=public
  • Sep 07, 2020 update docs for npm version
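
The guide's actual steps don't survive in this preview; the flow it documents looks roughly like the commands below (my-package and the version number are placeholders, not values from the gist):

npm login                                # authenticate against the npm registry
npm version patch                        # bump the version and create a git tag
npm publish --access=public              # needed the first time you publish a scoped public package
npm dist-tag add my-package@1.0.1 beta   # attach a dist tag to a published version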
@vstoykov
vstoykov / force_default_language_middleware.py
Last active October 16, 2024 13:56
Force Django to use settings.LANGUAGE_CODE for default language instead of request.META['HTTP_ACCEPT_LANGUAGE']
try:
    from django.utils.deprecation import MiddlewareMixin
except ImportError:  # Django < 1.10 has no MiddlewareMixin
    MiddlewareMixin = object


class ForceDefaultLanguageMiddleware(MiddlewareMixin):
    """
    Ignore Accept-Language HTTP headers so the i18n machinery
    falls back to settings.LANGUAGE_CODE.
    """
    def process_request(self, request):
        # Without the header, LocaleMiddleware picks LANGUAGE_CODE.
        request.META.pop('HTTP_ACCEPT_LANGUAGE', None)
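
For the middleware to have any effect it has to run before django.middleware.locale.LocaleMiddleware. A minimal settings sketch (the dotted path is an assumption about where the file lives in your project):

MIDDLEWARE = [
    # ...
    'myproject.middleware.ForceDefaultLanguageMiddleware',  # hypothetical module path
    'django.middleware.locale.LocaleMiddleware',
    # ...
]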
@majek
majek / udp_server.py
Created February 8, 2012 00:48
Simple python udp server
import logging
import socket

log = logging.getLogger('udp_server')

def udp_server(host='127.0.0.1', port=1234):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((host, port))
    while True:
        data, _addr = s.recvfrom(128 * 1024)  # one datagram per iteration
        yield data
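
Assuming the generator-style completion above, consuming the server is a short loop; the logging setup here is illustrative rather than part of the gist:

if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    for message in udp_server():
        log.info("received %r", message)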
@MohamedAlaa
MohamedAlaa / tmux-cheatsheet.markdown
Last active November 17, 2024 01:28
tmux shortcuts & cheatsheet

tmux shortcuts & cheatsheet

start new:

tmux

start new with session name:

tmux new -s myname
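
The preview stops at the first two commands; a few more session commands in the same style, all standard tmux and listed here only as a sketch of what the full cheatsheet covers:

list sessions:

tmux ls

attach to a named session:

tmux attach -t myname

kill a named session:

tmux kill-session -t myname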
@adamawolf
adamawolf / Apple_mobile_device_types.txt
Last active November 14, 2024 18:34
List of Apple's mobile device code types, a.k.a. machine ids (e.g. `iPhone1,1`, `Watch1,1`, etc.), and their matching product names
i386 : iPhone Simulator
x86_64 : iPhone Simulator
arm64 : iPhone Simulator
iPhone1,1 : iPhone
iPhone1,2 : iPhone 3G
iPhone2,1 : iPhone 3GS
iPhone3,1 : iPhone 4
iPhone3,2 : iPhone 4 GSM Rev A
iPhone3,3 : iPhone 4 CDMA
iPhone4,1 : iPhone 4S
@rgreenjr
rgreenjr / postgres_queries_and_commands.sql
Last active November 17, 2024 13:10
Useful PostgreSQL Queries and Commands
-- show running queries (pre 9.2)
SELECT procpid, age(clock_timestamp(), query_start), usename, current_query
FROM pg_stat_activity
WHERE current_query != '<IDLE>' AND current_query NOT ILIKE '%pg_stat_activity%'
ORDER BY query_start desc;
-- show running queries (9.2)
SELECT pid, age(clock_timestamp(), query_start), usename, query
FROM pg_stat_activity
WHERE query != '<IDLE>' AND query NOT ILIKE '%pg_stat_activity%'
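
The file continues beyond this preview; one more command in the same spirit, shown as a sketch (12345 stands in for a pid taken from pg_stat_activity):

-- cancel a running query by pid (pg_terminate_backend kills the whole connection)
SELECT pg_cancel_backend(12345);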
@gullevek
gullevek / New New Index Bloat Query
Last active July 25, 2022 17:35 — forked from mbanck/New New Index Bloat Query
This version uses pg_stats rather than pg_statistic, so any user can run the query
WITH btree_index_atts AS (
    SELECT nspname, relname, reltuples, relpages, indrelid, relam,
           regexp_split_to_table(indkey::text, ' ')::smallint AS attnum,
           indexrelid AS index_oid
    FROM pg_index
    JOIN pg_class ON pg_class.oid = pg_index.indexrelid
    JOIN pg_namespace ON pg_namespace.oid = pg_class.relnamespace
    JOIN pg_am ON pg_class.relam = pg_am.oid
    WHERE pg_am.amname = 'btree'
),
@acolyer
acolyer / service-checklist.md
Last active July 10, 2024 05:13
Internet Scale Services Checklist

Internet Scale Services Checklist

A checklist for designing and developing internet-scale services, inspired by James Hamilton's 2007 paper "On Designing and Deploying Internet-Scale Services."

Basic tenets

  • Does the design expect failures to happen regularly and handle them gracefully?
  • Have we kept things as simple as possible?
@0asa
0asa / sklearn-pyspark.py
Created January 27, 2015 11:12
Run a Scikit-Learn algorithm on top of Spark with PySpark
from pyspark import SparkConf, SparkContext
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
import pandas as pd
import numpy as np

conf = (SparkConf()
        .setMaster("local[*]")
        .setAppName("My app")
        .set("spark.executor.memory", "1g"))