#!/usr/bin/env python3
from icecream import ic
"""
Cost of burstable CPU credits above baseline:
- $0.09 per vCPU-Hour for Aurora Standard
- $0.12 per vCPU-Hour for Aurora I/O-Optimized clusters
Baseline cannot be found for Aurora, but should be the same as EC2:
https://aws.amazon.com/ec2/instance-types/t4/
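A quick back-of-the-envelope helper makes the math concrete (a sketch only: the 20% per-vCPU baseline below is the EC2 t4g.medium figure and is an assumption for Aurora, and real surplus-credit billing is tracked in CPU credits rather than a flat utilisation average):

#!/usr/bin/env python3
RATE_STANDARD = 0.09       # $ per vCPU-hour above baseline, Aurora Standard
RATE_IO_OPTIMIZED = 0.12   # $ per vCPU-hour above baseline, Aurora I/O-Optimized

def surplus_cpu_cost(avg_cpu_pct, vcpus, hours, baseline_pct=20.0, rate=RATE_STANDARD):
    """Approximate cost of sustained CPU usage above the baseline, in dollars."""
    over = max(avg_cpu_pct - baseline_pct, 0.0) / 100.0
    surplus_vcpu_hours = over * vcpus * hours
    return surplus_vcpu_hours * rate

# Example: a 2-vCPU instance at a steady 60% CPU for a 30-day month.
print(round(surplus_cpu_cost(60, 2, 30 * 24), 2))  # ~51.84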
@thom-vend
thom-vend / kafka-python-describe-log-dir-usage-example.py
Created January 7, 2024 20:57
Kafka python: describe log dir example
import kafka
from kafka.protocol.api import Request, Response
from kafka.protocol.types import Array, Boolean, Bytes, Int8, Int16, Int32, Int64, Schema, String
# Neither kafka-python nor confluent-kafka-python has implemented the DescribeLogDirsRequest
# WIP PR on kafka-python: see https://github.com/dpkp/kafka-python/pull/2278
# Backported stuff from the PR to support DescribeLogDirs with kafka-python
class DescribeLogDirsResponse_v0(Response):
    API_KEY = 35
    API_VERSION = 0
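For context, the matching request class looks roughly like this, following the wire format for API key 35 v0 (a sketch in kafka-python's schema style, not the PR's exact code; sending it through KafkaAdminClient's private _send_request_to_node is likewise an assumption):

class DescribeLogDirsRequest_v0(Request):
    API_KEY = 35
    API_VERSION = 0
    RESPONSE_TYPE = DescribeLogDirsResponse_v0
    SCHEMA = Schema(
        # A null topics array asks the broker to describe all of its log dirs.
        ('topics', Array(
            ('topic', String('utf-8')),
            ('partitions', Array(Int32)))),
    )

# Hypothetical usage via the admin client's internal request machinery:
# admin = kafka.KafkaAdminClient(bootstrap_servers='localhost:9092')
# future = admin._send_request_to_node(broker_id, DescribeLogDirsRequest_v0(topics=None))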
@thom-vend
thom-vend / import-route53-zone-to-terraform.sh
Last active December 13, 2022 23:12
Script to import a Route53 DNS zone and its records into Terraform when terraformer isn't working as expected
#!/usr/bin/env bash
# WARNING: You probably want to look at https://github.com/GoogleCloudPlatform/terraformer
# But it wasn't working great for me, so I ended up writing a more custom solution
# The script isn't perfect, but it was a one-shot operation....
# Script works on my machine (TM), using some GNU coreutils like gsed, gwc, etc.
# refactor of https://www.daniloaz.com/en/how-to-quickly-import-all-records-from-a-route53-dns-zone-into-terraform/
set -x
set -euo pipefail
# This script retrieves all DNS records from an AWS Route53 DNS zone and imports them into Terraform
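The core loop can also be sketched in Python with boto3 driving terraform import (a sketch, not the gist itself: ZONE_ID is a placeholder, and it assumes matching aws_route53_record resource blocks already exist in your Terraform configuration, since terraform import only attaches state):

#!/usr/bin/env python3
import re
import subprocess
import boto3

ZONE_ID = "Z0000000EXAMPLE"  # hypothetical hosted zone id

route53 = boto3.client("route53")
paginator = route53.get_paginator("list_resource_record_sets")

for page in paginator.paginate(HostedZoneId=ZONE_ID):
    for rrset in page["ResourceRecordSets"]:
        name = rrset["Name"].rstrip(".")
        rtype = rrset["Type"]
        # Terraform import id for aws_route53_record: ZONEID_recordname_type
        import_id = f"{ZONE_ID}_{name}_{rtype}"
        # Derive a resource address; assumes the blocks were generated with the same naming.
        address = "aws_route53_record." + re.sub(r"[^a-zA-Z0-9_]", "_", f"{name}_{rtype}".lower())
        subprocess.run(["terraform", "import", address, import_id], check=True)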

Debug your GitHub Actions build via SSH

  • Store action.yml into .github/actions/debugviassh/action.yml
  • Add your ngrok.com token in an env var NGROK_TOKEN
  • Add your public ssh key in another env var SSH_PUBLIC_KEY
  • Add the action call in your workflow where you want to stop and get ssh access
  • Find host/port to connect in https://dashboard.ngrok.com/cloud-edge/endpoints
  • ssh -l runner -p $ngrok_port $ngrok_host

note:

#!/usr/bin/env python3
"""
Downloaded from https://gist.github.com/rams3sh/15ac9487f2b6860988dc5fb967e754aa
Crafts a web request to the AWS REST API to hit an endpoint that works but isn't exposed by boto3 or the AWS CLI
Based on https://gist.github.com/andrewmackett/5f73bdd29aeed4728ecaace53abbe49b
Usage: python3 rds_log_downloader.py --region <region> --db <db_name> --logfile <log_file_to_download> --output <output_file_path>
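The underlying trick is to SigV4-sign a plain HTTPS request yourself with botocore and stream the response to disk; roughly like this (a sketch, assuming the v13/downloadCompleteLogFile REST path that this family of scripts targets):

import boto3
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest

def download_log(region, db, logfile, output):
    # REST-only endpoint that boto3 and the CLI don't expose for full log downloads.
    url = f"https://rds.{region}.amazonaws.com/v13/downloadCompleteLogFile/{db}/{logfile}"
    creds = boto3.Session().get_credentials().get_frozen_credentials()
    request = AWSRequest(method="GET", url=url)
    SigV4Auth(creds, "rds", region).add_auth(request)  # sign the request like the SDK would
    resp = requests.get(url, headers=dict(request.headers), stream=True, timeout=60)
    resp.raise_for_status()
    with open(output, "wb") as fh:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            fh.write(chunk)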
5.101.104.0/22
5.101.108.0/24
5.101.109.0/24
5.101.110.0/24
5.101.111.0/24
5.101.96.0/21
37.139.0.0/19
45.55.0.0/19
45.55.100.0/22
45.55.104.0/22
103.253.144.0/22
104.131.0.0/18
104.131.128.0/20
104.131.144.0/20
104.131.160.0/20
104.131.176.0/20
104.131.192.0/19
104.131.224.0/19
104.131.64.0/18
104.236.0.0/18
#!/usr/bin/env python3
import os
import sys
import argparse
import logging
# Edit this function to change the behavior
def transformation(content):
    # Strip zero-width non-joiner characters (U+200C) from the decoded text
    content = content.decode('utf-8').replace('\u200c', '')
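A hypothetical way to wire the transformation above to a file, assuming the function returns the cleaned text (the real gist drives this through argparse):

def process_file(path):
    # Read raw bytes, apply the transformation, write the cleaned text back in place.
    with open(path, "rb") as fh:
        data = fh.read()
    cleaned = transformation(data)
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(cleaned)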
@thom-vend
thom-vend / zookeeper-read-write-test-tool.py
Last active June 2, 2021 01:44
Zookeeper read-write test job in python3
#!/usr/bin/env python3
from datetime import datetime
from icecream import ic
from kazoo.client import KazooClient
from multiprocessing import Pool
from pprint import pprint
import argparse
import logging
import random
import signal
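Continuing from the imports above, the core of the read-write check might look like this (a sketch; the hosts string and znode path are placeholders, not the gist's actual defaults):

def read_write_check(hosts="127.0.0.1:2181", path="/zk-rw-test"):
    """Write a timestamped payload to a znode and read it back."""
    zk = KazooClient(hosts=hosts)
    zk.start()
    try:
        payload = datetime.utcnow().isoformat().encode("utf-8")
        zk.ensure_path(path)       # create the znode if it doesn't exist yet
        zk.set(path, payload)      # write
        data, stat = zk.get(path)  # read back
        assert data == payload, "read-back mismatch"
        return stat.mzxid          # transaction id of the last write
    finally:
        zk.stop()
        zk.close()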