I hereby claim:
- I am locoxella on github.
- I am locoxella (https://keybase.io/locoxella) on keybase.
- I have a public key ASDR9vY5gstHMAcW2a56K9yXH9LWG-SXvepByT43agaOXwo
To claim this, I am signing this object:
#!/bin/bash
#
# Configure sshd on MinGW for Windows
# Create host keys
ssh-keygen -f /etc/ssh/ssh_host_rsa_key -N '' -t rsa
ssh-keygen -f /etc/ssh/ssh_host_dsa_key -N '' -t dsa
ssh-keygen -f /etc/ssh/ssh_host_ecdsa_key -N '' -t ecdsa
ssh-keygen -f /etc/ssh/ssh_host_ed25519_key -N '' -t ed25519
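On modern OpenSSH, the four commands above can be collapsed into one: `ssh-keygen -A` creates any missing host key of each default type, with an empty passphrase. A sketch, using `-f` as a directory prefix so it can run unprivileged (the `/tmp/sshdemo` prefix is a demo assumption; drop `-f` to write to /etc/ssh for real):

```shell
# -A: create missing host keys of all default types with empty passphrases.
# -f PREFIX: prepend PREFIX to the default /etc/ssh path (demo only; the
# prefixed directory must already exist).
mkdir -p /tmp/sshdemo/etc/ssh
ssh-keygen -A -f /tmp/sshdemo
```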
Although you can edit the ~/.ssh/config file to proxy connections through a bastion, the only command you need to open an SSH tunnel through a bastion (authenticating with a .pem file) is:
ssh -i ~/.ssh/file.pem -N -L 8080:internal.host:80 [email protected]
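The ~/.ssh/config equivalent looks like this (a sketch; `bastion.host`, `internal.host`, `user`, and `file.pem` are the placeholder values from the command above), after which the tunnel becomes just `ssh -N bastion-tunnel`:

```
Host bastion-tunnel
    HostName bastion.host
    User user
    IdentityFile ~/.ssh/file.pem
    LocalForward 8080 internal.host:80
```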
#!/usr/bin/env fish
#
# Configure autocompletion for fish, since the aws CLI does not ship it yet
#
# More info: https://aws.uservoice.com/forums/598381-aws-command-line-interface/suggestions/33168313-autocomplete-for-fish-shell
# Credits: https://github.com/otakumike, http://stackoverflow.com/users/808850/scooter-dangle
#
complete --command aws --no-files --arguments '(begin; set --local --export COMP_SHELL fish; set --local --export COMP_LINE (commandline); aws_completer | sed \'s/ $//\'; end)'
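To make the completion permanent rather than per-session, it can be written to fish's auto-load directory (a sketch, assuming the default config location `~/.config/fish/completions/`, from which fish auto-loads per-command completions):

```shell
# Write the completion to fish's auto-load directory so every new shell has it.
mkdir -p ~/.config/fish/completions
cat > ~/.config/fish/completions/aws.fish <<'EOF'
complete --command aws --no-files --arguments '(begin; set --local --export COMP_SHELL fish; set --local --export COMP_LINE (commandline); aws_completer | sed \'s/ $//\'; end)'
EOF
```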
#!/usr/bin/env bash
#
# I always find myself creating a 1G swapfile for tiny EC2 machines (I know I
# should rely only on RAM, but the free SSD is there and it has saved my life
# in the past), and I never get better at recalling each step.
#
# One-liner:
# sudo fallocate -l 1G /swapfile ; sudo chmod 600 /swapfile ; sudo mkswap /swapfile ; sudo swapon /swapfile ; echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab ; swapon --show
if [ "$EUID" -ne 0 ]; then
  echo "Please run as root"
  exit 1
fi
fallocate -l 1G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
echo '/swapfile none swap sw 0 0' >> /etc/fstab
swapon --show
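One caveat with the one-liner: `fallocate` is not supported on every filesystem (e.g. ext3), in which case `dd` is the usual fallback. A minimal sketch of a helper that tries both (the demo writes a 4 MB file under /tmp so it runs unprivileged; real use would be `/swapfile` with size 1024, as root, followed by mkswap/swapon as above):

```shell
# Create a swapfile-style file: try fallocate first, fall back to dd.
make_swapfile() {
  local path=$1 size_mb=$2
  fallocate -l "${size_mb}M" "$path" 2>/dev/null \
    || dd if=/dev/zero of="$path" bs=1M count="$size_mb" status=none
  # Swapfiles must not be readable by other users.
  chmod 600 "$path"
}

# Demo values only; real use: make_swapfile /swapfile 1024
make_swapfile /tmp/swaptest 4
```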
#!/usr/bin/env bash
# Collect every server_name from the enabled nginx vhosts (dropping the trailing
# ';' and the catch-all '_'), then point them all at 127.0.0.1 in /etc/hosts.
virtualhosts=$(awk 'BEGIN { ORS=" " }; /^\s*server_name/ { for ( i=2; i <= NF; i++ ) { serveralias[$i]++ } } END { for ( server in serveralias) { print server } }' /etc/nginx/sites-enabled/* | sed 's/\;//g' | sed 's/ _//g')
sed -i "/127\.0\.0\.1/ s/.*/127\.0\.0\.1 localhost $virtualhosts/g" /etc/hosts
#!/usr/bin/env bash
# Use the Azure CLI to check whether the given AKS cluster is private and, if so,
# add its private IP and hostname to the hosts file.
# Use the command below to download and run this script directly from this gist:
# $ curl -sL <raw gist url> | bash -s /subscriptions/<subscription>/resourceGroups/<resource group>/providers/Microsoft.ContainerService/managedClusters/<AKS name>
# Use sudo to run it as superuser and let the script edit the /etc/hosts file directly:
# $ curl -sL <raw gist url> | sudo bash -s /subscriptions/<subscription>/resourceGroups/<resource group>/providers/Microsoft.ContainerService/managedClusters/<AKS name>
# The <raw gist url> is the URL provided by the GitHub Gist "Raw" button.
# Once you have reviewed the code, it is safe to run it with sudo, since the
# script cannot be modified for that version's raw gist URL.
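The script body itself is not reproduced here. A minimal sketch of how such a script might parse its resource-ID argument (the ID value below is hypothetical, and the field positions are an assumption based on the ARM ID format shown above; the actual private-cluster check would then go through `az aks show` with the extracted group and name):

```shell
# Split the ARM resource ID on '/': the resource group is field 5 and the
# AKS name is field 9. Hypothetical ID, matching the format in the usage line.
resource_id="/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.ContainerService/managedClusters/my-aks"
resource_group=$(echo "$resource_id" | cut -d/ -f5)
aks_name=$(echo "$resource_id" | cut -d/ -f9)
echo "$resource_group $aks_name"
```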
#!/usr/bin/env bash
# Use the Azure CLI to check whether the given SQL server is private and, if so,
# add its private IP and hostname to the hosts file.
# Use the command below to download and run this script directly from this gist:
# $ curl -sL <raw gist url> | bash -s /subscriptions/<subscription>/resourceGroups/<resource group>/providers/Microsoft.Sql/servers/<SQL name>
# Use sudo to run it as superuser and let the script edit the /etc/hosts file directly:
# $ curl -sL <raw gist url> | sudo bash -s /subscriptions/<subscription>/resourceGroups/<resource group>/providers/Microsoft.Sql/servers/<SQL name>
# The <raw gist url> is the URL provided by the GitHub Gist "Raw" button.
# Once you have reviewed the code, it is safe to run it with sudo, since the
# script cannot be modified for that version's raw gist URL.
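Again, the script body is not reproduced here. The "add its private IP and host to hosts file" step might look like the sketch below (an assumption about the missing body; the IPs and the `privatelink` hostname are hypothetical demo values, and `HOSTS` is parameterized so the demo can run against /tmp instead of /etc/hosts):

```shell
# Idempotent hosts-file update: drop any stale line for the host, then append
# the fresh mapping, so re-running the script never duplicates entries.
HOSTS=${HOSTS:-/etc/hosts}
add_host_entry() {
  local ip=$1 host=$2
  sed -i "/[[:space:]]$host\$/d" "$HOSTS"
  printf '%s %s\n' "$ip" "$host" >> "$HOSTS"
}

# Demo against a scratch file; the second call replaces the first entry.
HOSTS=/tmp/hosts.demo
: > "$HOSTS"
add_host_entry 10.0.0.4 myserver.privatelink.database.windows.net
add_host_entry 10.0.0.5 myserver.privatelink.database.windows.net
```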