michalc / elevator-saga.js
Last active September 1, 2024 14:01
My Elevator Saga solution
{
  init: function(elevators, floors) {
    const intSort = function(a, b) {
      if (a < b) return -1;
      if (a > b) return 1;
      return 0;
    }
    const floorWantsUp = function(floorNum) {
michalc / restore.py
Last active July 17, 2024 16:15
Restore all objects under S3 prefix in versioned bucket
# Restores all objects under a prefix in an S3 bucket to the version they were before being deleted.
# Works by deleting the latest delete marker for the objects.
import boto3
bucket = 'the-bucket'
prefix = 'the-prefix/'
region = 'eu-west-2'
client = boto3.client('s3', region_name=region)
paginator = client.get_paginator('list_object_versions')
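The preview cuts off at the paginator; a minimal sketch of the delete-marker removal the description refers to could look like the following (the batching details here are an assumption, not the gist's exact code):

# For each page of versions, find the delete markers that are currently the latest
# version of their key, and remove them so the previous version becomes visible again
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    to_remove = [
        {'Key': marker['Key'], 'VersionId': marker['VersionId']}
        for marker in page.get('DeleteMarkers', [])
        if marker['IsLatest']
    ]
    if to_remove:
        # delete_objects accepts at most 1000 keys per call, which matches the page size
        client.delete_objects(Bucket=bucket, Delete={'Objects': to_remove})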
michalc / block_or_not.py
Created April 24, 2024 13:34
Script for checking if any PostgreSQL session blocks another
# docker run --rm -it -p 5432:5432 -e POSTGRES_PASSWORD=password postgres:14
import pprint
import threading
import time
from concurrent.futures import ThreadPoolExecutor
from contextlib import contextmanager
import psycopg2
import psycopg2.extras
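The preview stops at the imports; the blocking check itself can be expressed with PostgreSQL's pg_blocking_pids(). A minimal sketch, assuming the local Docker container from the comment above is the target (the gist's actual thread and lock choreography is not reproduced here):

def blocked_sessions(conn):
    # pg_blocking_pids(pid) returns the PIDs of sessions blocking the given backend,
    # so any row returned here means one session is blocked by another
    with conn.cursor() as cur:
        cur.execute("""
            SELECT pid, pg_blocking_pids(pid) AS blocked_by
            FROM pg_stat_activity
            WHERE cardinality(pg_blocking_pids(pid)) > 0
        """)
        return cur.fetchall()

with psycopg2.connect('postgresql://postgres:password@127.0.0.1:5432/postgres') as conn:
    pprint.pprint(blocked_sessions(conn))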
michalc / cloudfoundry-check-stacks.py
Created November 14, 2023 07:38
Check the stack of apps in CloudFoundry
import json
import os
from urllib.parse import urlparse, parse_qsl
import requests
from rich import box
from rich.console import Console
from rich.table import Table
with open(f'{os.environ["HOME"]}/.cf/config.json') as f:
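The preview ends as the CF CLI's config file is opened. A rough sketch of the rest, assuming the usual Target and AccessToken fields in ~/.cf/config.json and the v3 API exposing the stack under lifecycle.data; this is a plausible reconstruction, not the gist's exact code:

with open(f'{os.environ["HOME"]}/.cf/config.json') as f:
    config = json.load(f)

api_url = config['Target']
token = config['AccessToken']  # the CF CLI stores this with the 'bearer ' prefix included

table = Table(box=box.SIMPLE)
table.add_column('App')
table.add_column('Stack')

# Page through /v3/apps; for buildpack apps the stack is under lifecycle.data.stack
url = f'{api_url}/v3/apps'
while url is not None:
    resp = requests.get(url, headers={'Authorization': token})
    resp.raise_for_status()
    data = resp.json()
    for app in data['resources']:
        table.add_row(app['name'], app['lifecycle']['data'].get('stack') or '')
    next_page = data['pagination']['next']
    url = next_page['href'] if next_page else None

Console().print(table)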
michalc / s3_bulk_delete.py
Last active October 25, 2024 04:46
Bulk delete files from an AWS S3 bucket in Python using multiple threads via a parallel (eventually) depth-first search
# Deletes objects in bulk using boto3's delete_objects, but using multiple threads to achieve some
# parallelism - in spite of the GIL, multiple HTTP requests to S3 should happen at the same time.
# Instead of looping over all keys under the root prefix, it walks the tree of keys of delimiter-
# defined "folders" in a depth-first way, which allows each page to be processed by a separate
# thread as it's discovered. Depth-first is done because the depth is limited by the maximum key
# size of 1024 in S3, so there is a limit to the memory used by the algorithm to store the next
# requests to make. This would not be the case with breadth-first, because there is no limit to
# how many keys are in any folder.
#
# To do the search in parallel, each bit of work (i.e. an HTTP request to fetch a page of keys
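The preview is only the explanatory comment. As a much simplified illustration of the same idea (not the gist's worker/queue design): walk the delimiter-defined "folders" depth-first and hand each batch of keys to a thread pool, so the delete_objects calls overlap even though the listing below stays single-threaded. The bucket and prefix names are placeholders.

from concurrent.futures import ThreadPoolExecutor

import boto3

client = boto3.client('s3')
executor = ThreadPoolExecutor(max_workers=10)

def delete_under_prefix(bucket, prefix):
    futures = []

    def walk(prefix):
        # Contents are the keys directly under this prefix; CommonPrefixes are its
        # sub-folders, visited depth-first so pending work stays bounded by key depth
        paginator = client.get_paginator('list_objects_v2')
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix, Delimiter='/'):
            keys = [{'Key': obj['Key']} for obj in page.get('Contents', [])]
            if keys:
                # delete_objects takes up to 1000 keys, which is also the page size
                futures.append(executor.submit(
                    client.delete_objects, Bucket=bucket, Delete={'Objects': keys},
                ))
            for sub in page.get('CommonPrefixes', []):
                walk(sub['Prefix'])

    walk(prefix)
    for future in futures:
        future.result()  # propagate any errors

delete_under_prefix('my-bucket', 'my-prefix/')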
michalc / decrypt-ses-emails-in-s3.py
Last active November 6, 2024 08:18
Decrypt KMS-encrypted SES emails in an S3 bucket
import base64
import json
import boto3
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
s3 = boto3.resource('s3')
bucket = s3.Bucket('my-bucket')
kms_client = boto3.client('kms')
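The preview ends after the clients are created. SES's KMS encryption is S3 client-side envelope encryption, so the remainder plausibly resembles the following; the x-amz-key-v2 / x-amz-iv / x-amz-matdesc metadata names are the S3 encryption client's conventions, and the loop is an assumption rather than the gist's exact code:

for obj_summary in bucket.objects.all():
    obj = obj_summary.Object().get()
    metadata = obj['Metadata']

    # The envelope: a KMS-encrypted data key, the AES-GCM IV, and the encryption
    # context that KMS needs in order to decrypt the key
    encrypted_key = base64.b64decode(metadata['x-amz-key-v2'])
    iv = base64.b64decode(metadata['x-amz-iv'])
    context = json.loads(metadata['x-amz-matdesc'])

    key = kms_client.decrypt(
        CiphertextBlob=encrypted_key,
        EncryptionContext=context,
    )['Plaintext']

    # The GCM tag is appended to the ciphertext, which is what AESGCM.decrypt expects
    ciphertext = obj['Body'].read()
    email_bytes = AESGCM(key).decrypt(iv, ciphertext, None)
    print(email_bytes[:100])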
michalc / make_8gb_legacy_zip.py
Created January 3, 2022 16:37
Generating a Zip 2.0 file that's (just under) 8GiB
# Often it's claimed that a Zip 2.0 file cannot be bigger than 4GiB
# Here's how to make one that's just under 8GiB
from datetime import datetime
from stream_zip import stream_zip, ZIP_32
now = datetime.now()
perms = 0o600
def files():
    for i in range(0, 0xffff):
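The preview stops inside files(). Purely to show stream_zip's member-tuple shape, a guess at how the generator continues is below; the specific per-entry size that keeps the total just under 8 GiB is worked out in the full gist and is not reproduced here, so the 131072 bytes per entry is an illustrative stand-in.

def files():
    for i in range(0, 0xffff):
        # Each member is (name, modified_at, mode, method, iterable of bytes chunks);
        # the per-entry size here is illustrative, not the gist's exact figure
        yield f'file-{i}.bin', now, perms, ZIP_32, (b'-' * 131_072,)

with open('just-under-8gib.zip', 'wb') as f:
    for chunk in stream_zip(files()):
        f.write(chunk)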
class MyPipeline(_PipelineV2):
    # Everything is a method so nothing happens at import time, for flexibility (although possibly
    # it does a bit of discovery magic... need to think about that...)
    # Everything is a _static_ method: nothing is on self, since things are run on different bits
    # of hardware, and any run-time dependencies get injected in
    #
    # _PipelineV2 won't actually have any code: other parts of the system will interrogate its
    # subclasses as needed. For example
    # - Code in Data Flow would construct a DAG
    # - The test harness would then run this and upstream pipelines synchronously
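This fragment has no title or description attached, so the intent has to be read from the comments: a declarative base class whose subclasses carry only static methods, with schedulers and test harnesses inspecting them from the outside. A purely hypothetical sketch of that shape (every name below is invented):

import boto3

class _PipelineV2:
    # Deliberately no behaviour: other tooling discovers and interrogates subclasses
    # rather than the subclasses calling into a framework
    upstream = ()

class FetchPipeline(_PipelineV2):
    @staticmethod
    def run(s3_client):
        # Run-time dependencies (clients, connections) are injected as arguments
        return s3_client.get_object(Bucket='some-bucket', Key='latest.csv')['Body'].read()

class TransformPipeline(_PipelineV2):
    upstream = (FetchPipeline,)

    @staticmethod
    def run(fetched):
        # A pure transformation of the upstream pipeline's output
        return fetched.upper()

# A test harness could then run a pipeline and its upstream synchronously, while
# something like Data Flow would instead turn the upstream tuples into DAG edges
result = TransformPipeline.run(FetchPipeline.run(boto3.client('s3')))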
michalc / libcrypto-decrypt-aes-ctr-little-endian.py
Last active September 13, 2021 05:27
Use libcrypto (OpenSSL) directly from Python with ctypes without compiling anything: AES decrypt with a little endian CTR counter
from contextlib import contextmanager
from ctypes import POINTER, cdll, c_char_p, c_void_p, c_int, create_string_buffer, byref
from sys import platform
# Uses a _little_ endian CTR counter, which OpenSSL doesn't directly support.
# Could be used to decrypt AES-encrypted ZIP files
def decrypt_aes_256_ctr_little_endian(
        key, ciphertext_chunks,
        get_libcrypto=lambda: cdll.LoadLibrary({'linux': 'libcrypto.so', 'darwin': 'libcrypto.dylib'}[platform])
):
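The preview ends at the function signature. Independent of the gist's ctypes/libcrypto calls, the little-endian-counter idea can be expressed with the cryptography package: build the keystream by ECB-encrypting 16-byte counter blocks whose counter is encoded little-endian, then XOR it with the ciphertext. This is an illustration of the technique, not the gist's implementation, and it assumes the counter starts at zero:

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def decrypt_aes_256_ctr_little_endian_reference(key, ciphertext):
    # key must be 32 bytes for AES-256; CTR is symmetric, so decryption is just
    # keystream XOR ciphertext, the keystream being AES applied to counter blocks
    encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    plaintext = bytearray()
    for counter, start in enumerate(range(0, len(ciphertext), 16)):
        # OpenSSL's own CTR mode increments a big-endian counter; here the 16-byte
        # counter block is little-endian instead
        keystream = encryptor.update(counter.to_bytes(16, byteorder='little'))
        chunk = ciphertext[start:start + 16]
        plaintext.extend(a ^ b for a, b in zip(chunk, keystream))
    return bytes(plaintext)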
michalc / postman-hawk.js
Last active September 21, 2021 05:44
Postman pre-request script for Hawk authentication in custom header
/*****************************************************************************/
const hawkId = pm.variables.get('hawk_id');
const hawkKey = pm.variables.get('hawk_key');
const hawkHeader = pm.variables.get('hawk_header') || 'authorization';
/*****************************************************************************/
const timestamp = parseInt(new Date().getTime() / 1000);
const nonce = CryptoJS.enc.Base64.stringify(CryptoJS.lib.WordArray.random(6));
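The preview stops before the MAC is computed. For reference, a Hawk header is an HMAC-SHA256 over a fixed 'hawk.1.header' normalized string; the Python below mirrors what the CryptoJS code presumably goes on to do (the empty payload-hash and ext fields are assumptions):

import base64
import hashlib
import hmac

def hawk_header(hawk_id, hawk_key, method, url_path, host, port, timestamp, nonce):
    # The normalized string the MAC covers, as defined by the Hawk scheme
    normalized = (
        'hawk.1.header\n'
        f'{timestamp}\n'
        f'{nonce}\n'
        f'{method.upper()}\n'
        f'{url_path}\n'
        f'{host.lower()}\n'
        f'{port}\n'
        '\n'   # payload hash, empty when the body is not covered
        '\n'   # ext, empty here
    )
    mac = base64.b64encode(
        hmac.new(hawk_key.encode(), normalized.encode(), hashlib.sha256).digest()
    ).decode()
    return f'Hawk id="{hawk_id}", ts="{timestamp}", nonce="{nonce}", mac="{mac}"'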