Luke Plausin (lukeplausin)

lukeplausin / home_ssh_server.sh
Last active December 13, 2023 14:58
This script configures a Linux host to accept inbound SSH connections. It installs sshd, miniupnp (for port forwarding) and Duck DNS (for DNS registration).
# This script assumes that you are using a router or device which uses NAT and supports UPnP,
# that you are using IPv4, and that you run a Debian-based Linux distro (this covers the vast majority of home networks and the Raspberry Pi).
# Home routers often won't let you expose port 22, so the port is exposed and mapped with UPnP instead.
# Copy the script to your computer, and then edit the variables at the top.
# Then run the script by typing "bash ./home_ssh_server.sh" into the terminal.
### Variables ###
# This is the public hostname which you will use to connect to your home server.
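The rest of the script is truncated in this preview. As a rough sketch of the two external steps it automates (variable names here are illustrative, not necessarily the ones the script defines), the UPnP mapping and the Duck DNS registration come down to:
# Sketch only: forward an external port to the local SSH port via UPnP (miniupnpc),
# then point a Duck DNS subdomain at the current public IP.
SSH_PORT=22
EXTERNAL_PORT=2222
DUCKDNS_DOMAIN=myhome          # myhome.duckdns.org
DUCKDNS_TOKEN=xxxxxxxx         # token from https://www.duckdns.org
LOCAL_IP=$(hostname -I | awk '{print $1}')
# Ask the router (UPnP/IGD) to forward EXTERNAL_PORT to LOCAL_IP:SSH_PORT
upnpc -a "$LOCAL_IP" "$SSH_PORT" "$EXTERNAL_PORT" TCP
# Update Duck DNS; leaving ip= empty lets Duck DNS detect the public IP
curl -s "https://www.duckdns.org/update?domains=${DUCKDNS_DOMAIN}&token=${DUCKDNS_TOKEN}&ip="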
lukeplausin / verify_crowdstrike_installed.ps1
Created March 18, 2021 17:57
This is a PowerShell script which I wrote to verify that CrowdStrike is installed, for deployment with Microsoft Intune (Endpoint Manager)
# This is a PowerShell script which I wrote to verify that CrowdStrike is installed
# for deployment with Microsoft Intune (Endpoint Manager).
# You can use it for any installer which deploys a Windows service; just change the $service_name
# variable from "csagent" to the name of the service. For example, Dropbox for Windows is "DbxSvc".
# Name of the service
$service_name = "csagent"
# Number of retries
$max_attempts = 15
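The preview cuts off before the check itself. A minimal sketch of such a retry loop (exit codes follow the usual Intune detection-script convention; this is not necessarily the gist's exact code):
# Poll for the service; succeed as soon as it appears, fail after $max_attempts tries.
for ($i = 1; $i -le $max_attempts; $i++) {
    $service = Get-Service -Name $service_name -ErrorAction SilentlyContinue
    if ($service) {
        Write-Output "Service '$service_name' is installed (status: $($service.Status))."
        exit 0
    }
    Start-Sleep -Seconds 10
}
Write-Output "Service '$service_name' was not found after $max_attempts attempts."
exit 1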
lukeplausin / top_modules.txt
Created May 25, 2021 23:41
Top Python PyPI modules, by downloads
# Ever wondered which modules are the most popular? Well... here they are: the top 500 PyPI modules, by download count.
urllib3, 910195765
six, 749120890
botocore, 670113460
python-dateutil, 629757389
pip, 629606070
requests, 626954494
s3transfer, 595019137
certifi, 570148733
lukeplausin / pod-copy-data.sh
Last active April 27, 2024 06:42
Copy data from the local filesystem into a pod in Kubernetes
# If your pod container doesn't have bash in the path, you might need to replace `bash` with `/bin/bash`, `/bin/sh` or `/bin/ash`.
# Simple string from the command line - puts the literal text 'AAA' into a new file /tmp/test.txt
echo 'AAA' | kubectl -n MyNamespace exec -i $my_pod_id -c my-pod-container -- bash -c 'cat - > /tmp/test.txt'
# Copy / inject a file from the local filesystem into a new file in the pod in the remote kubernetes cluster
kubectl -n MyNamespace exec -i $my_pod_id -c my-pod-container -- bash -c 'cat - > /tmp/file.txt' </my/input/file.txt
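The same stdin trick extends to whole directories by piping a tar stream, assuming tar is available in the container image (a sketch, not part of the original gist):
# Stream a local directory into the pod and unpack it under /tmp
tar cf - ./my-local-dir | kubectl -n MyNamespace exec -i $my_pod_id -c my-pod-container -- tar xf - -C /tmp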
lukeplausin / cleanup_git.sh
Last active November 16, 2022 11:48
Clean up merged git branches
#!/bin/zsh
# Do merged git branches clutter up your local machine?
# This handy script can automatically clean them up for you.
# To install - put the script somewhere on the filesystem and add a source command to your .bashrc / .zshrc
# e.g. - 'source ~/code/scrapbook/cleanup_git.sh'
# When the script is sourced you will get two commands -
# "gitclean" - delete merged branches in current folder (with prompt)
lukeplausin / auto_configure_aws_cli_sso_roles.sh
Last active February 3, 2025 09:28
Automatically configure AWS SSO configuration file for all available accounts and roles
#!/bin/bash -e
# How to use this script:
# 1. Follow these instructions to configure a single AWS account to do initial login with SSO
# https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-sso.html
# 2. Export AWS_PROFILE=... and then run "aws sso login" to get an SSO token
# 3. Once signed in with AWS SSO, run this script to automatically list out all the other accounts and roles and add them to your config file
# If you want to filter roles / accounts in the process, or validate config before committing it, you can customise the script to do this.
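The core of the script is two SSO API calls per account. A condensed sketch (assuming jq is installed, a token has been cached by "aws sso login", and SSO_START_URL / SSO_REGION are set by you; the profile naming scheme is illustrative):
# Enumerate accounts and roles visible to the cached SSO token and append a profile for each.
ACCESS_TOKEN=$(jq -r 'select(.accessToken != null) | .accessToken' ~/.aws/sso/cache/*.json | head -n 1)
aws sso list-accounts --access-token "$ACCESS_TOKEN" --output json |
  jq -r '.accountList[] | "\(.accountId) \(.accountName)"' |
  while read -r account_id account_name; do
    aws sso list-account-roles --access-token "$ACCESS_TOKEN" --account-id "$account_id" --output json |
      jq -r '.roleList[].roleName' |
      while read -r role_name; do
        cat >> ~/.aws/config <<EOF

[profile ${account_name// /-}-${role_name}]
sso_start_url = ${SSO_START_URL}
sso_region = ${SSO_REGION}
sso_account_id = ${account_id}
sso_role_name = ${role_name}
EOF
      done
  done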
lukeplausin / transfer_ssm_file.sh
Last active March 18, 2025 21:19
Transfer a file to an EC2 instance over SSM without using S3 (SSM only)
# This script explains how to transfer a file to EC2 using SSM ONLY!
# You will need permission to run SSM commands on the target machine, and sudo access as well
# Info
INSTANCE_ID=i-1234567890
FILE_NAME=the_file.tar.gz
# Step 1: Run command on machine to install netcat and dump from port to filename
# < Start session
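The rest of the gist is truncated in this preview. The general shape of the approach (the port number and netcat flags are illustrative, not the gist's exact commands):
# On the instance, inside an SSM session, listen on a port and write the incoming stream to the file:
#   nc -l -p 8080 > ${FILE_NAME}
# Locally, forward a local port to that instance port over SSM:
aws ssm start-session \
    --target "${INSTANCE_ID}" \
    --document-name AWS-StartPortForwardingSession \
    --parameters '{"portNumber":["8080"],"localPortNumber":["8080"]}' &
# Then push the file through the forwarded port:
nc localhost 8080 < "${FILE_NAME}"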
lukeplausin / docker-compose.yml
Created December 10, 2021 20:30
Home photo server
# From:
# https://dl.photoprism.org/docker/docker-compose.yml
version: '3.5'
# Example Docker Compose config file for PhotoPrism (Linux / AMD64)
#
# Documentation : https://docs.photoprism.org/getting-started/docker-compose/
# Docker Hub URL: https://hub.docker.com/r/photoprism/photoprism/
#
# Please run behind a reverse proxy like Caddy, Traefik or Nginx if you need HTTPS / SSL support
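# For reference, the heart of the upstream file is a single photoprism service roughly like the
# trimmed sketch below; see the URL above for the complete, authoritative version.
services:
  photoprism:
    image: photoprism/photoprism:latest
    ports:
      - "2342:2342"                            # web UI over plain HTTP
    environment:
      PHOTOPRISM_ADMIN_PASSWORD: "insecure"    # change this before the first start
    volumes:
      - "./originals:/photoprism/originals"    # your photo library
      - "./storage:/photoprism/storage"        # cache, database and sidecar files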
lukeplausin / export_terraform_cloud_state_to_bucket_storage.py
Last active October 28, 2023 13:08
Export terraform cloud state files with history to a GCS bucket
# This example shows how you can export your Hashicorp Terraform Cloud state files out of
# Terraform Cloud and into a bucket storage provider, with the full history.
# This example assumes that you have object versioning enabled on your bucket storage.
# The example is written for using Google Cloud Storage, but can be adapted for any storage provider.
# Before you run the code, you need to generate an API token in Terraform Cloud, and export it as "TFE_TOKEN" in your shell.
# pip install terrasnek google-cloud-storage ipython
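The overall flow is: list workspaces, list each workspace's state versions, download each state file, and upload it to the bucket. A condensed sketch using the Terraform Cloud HTTP API directly with requests (the gist itself uses the terrasnek client; the organization and bucket names are placeholders, and this version keys each object by state serial rather than relying on bucket versioning):
import os
import requests
from google.cloud import storage

TFC = "https://app.terraform.io/api/v2"
HEADERS = {"Authorization": f"Bearer {os.environ['TFE_TOKEN']}"}
ORG = "my-org"                                                   # placeholder
bucket = storage.Client().bucket("my-terraform-state-archive")   # placeholder

# 1. List the organization's workspaces (pagination omitted for brevity).
workspaces = requests.get(f"{TFC}/organizations/{ORG}/workspaces", headers=HEADERS).json()["data"]
for ws in workspaces:
    ws_name = ws["attributes"]["name"]
    # 2. List the workspace's state versions.
    svs = requests.get(
        f"{TFC}/state-versions",
        headers=HEADERS,
        params={"filter[workspace][name]": ws_name, "filter[organization][name]": ORG},
    ).json()["data"]
    for sv in svs:
        # 3. Download the raw state file and 4. upload it to the bucket.
        state = requests.get(sv["attributes"]["hosted-state-download-url"], headers=HEADERS).content
        serial = sv["attributes"]["serial"]
        bucket.blob(f"{ws_name}/terraform.tfstate.{serial}").upload_from_string(state)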
lukeplausin / undelete_gcs_objects.py
Created January 26, 2024 12:55
This gist describes how to undelete files in GCS while retaining most of the metadata. I couldn't find any examples of how to do this other than in the UI. GCS doesn't seem to use delete markers like AWS. If anyone knows a better way please say so.
# This gist describes how to undelete files in GCS while retaining most of the metadata.
# GCS doesn't seem to use delete markers like AWS. If anyone knows a better way please say so in the comments.
# I couldn't find any examples of how to do this other than in the UI.
# GCS documentation doesn't seem to include instructions for restoring versioned objects in python, but I found
# that this method works. Confusingly, they use terms like version, generation, and revision interchangeably in
# the documentation. I couldn't understand how object deletion works on versioned buckets.
import tempfile
from google.cloud import storage
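The gist appears to go via a temporary file (hence the tempfile import above). A shorter variant that restores deleted objects with a server-side copy of a specific generation, reusing the imports above, might look like this (the bucket name is a placeholder):
client = storage.Client()
bucket = client.bucket("my-versioned-bucket")   # placeholder

# versions=True lists every generation, including noncurrent (deleted) ones.
for blob in client.list_blobs(bucket, versions=True):
    if blob.time_deleted is None:
        continue    # this generation is still live, nothing to restore
    # Copying a specific generation back onto its own name makes it the new live
    # version and carries most of that generation's metadata with it.
    bucket.copy_blob(blob, bucket, new_name=blob.name, source_generation=blob.generation)
    print(f"Restored {blob.name} from generation {blob.generation}")
Note that this restores every noncurrent generation it finds, so for objects with several deleted generations the last one copied wins; filter by generation or time_deleted if you only want the most recent version back.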