Which Transformation?
The main criterion in choosing a transformation is: what works with the data? As the examples above indicate, it is also important to consider two further questions.
What makes physical (biological, economic, whatever) sense, for example in terms of limiting behaviour as values get very small or very large? This question often leads to the use of logarithms.
Can we keep dimensions and units simple and convenient? If possible, we prefer measurement scales that are easy to think about.
The cube root of a volume and the square root of an area both have the dimensions of length, so far from complicating matters, such transformations may simplify them. Reciprocals usually have simple units, as mentioned earlier. Often, however, somewhat complicated units are a sacrifice that has to be made.
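The dimensional point can be illustrated with a short numeric sketch (the values are made up for illustration):

```python
import numpy as np

volumes = np.array([1.0, 8.0, 27.0])   # m^3
areas = np.array([1.0, 4.0, 9.0])      # m^2

# cube root of a volume and square root of an area both have dimension length (m)
lengths_from_volume = np.cbrt(volumes)
lengths_from_area = np.sqrt(areas)
```

Both transformed arrays are plain lengths in metres, so the transformation simplifies the units rather than complicating them.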
oxidizeddreams / zsh-dir-stacks
Created November 27, 2018 14:41
zsh directory stack autocompletion
# The directory stack keeps track of the various directories you've visited in a
# given shell session. Here is a way to make the stack persistent using zsh. The
# aim is to make it easier to switch between frequently visited directories. The
# basic idea for this was taken from a 1991 Usenet posting by Uri Guttman. Other
# bits come from Paul Falstad's Z Shell Guide.
# First, make a new file with the following contents:
#!/usr/bin/env ruby
dirs = []
ARGF.each { |line| dirs.push(line) }
# emit the saved directories newest-first
dirs.reverse.each { |line| puts line }
oxidizeddreams / sumologic
Created October 30, 2018 12:27
sumologic cheatsheet
# SumoLogic CheatSheet
# Algorithms
A. Luhn - Uses Luhn's algorithm to check message logs for strings of numbers that may be credit card numbers, and then validates them. It takes a string as input, strips out all characters that are not numerals, and checks whether the resulting string is a valid credit card number, returning true or false accordingly.
1. Syntax
- luhn(<field>) [as <field>]
- luhn("<input string>") [as <field>]
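What the operator computes can be sketched in Python (a hypothetical reimplementation for illustration, not Sumo Logic's code; the function name `luhn` simply mirrors the operator):

```python
def luhn(s):
    """Return True if the numerals in `s` pass the Luhn checksum."""
    digits = [int(c) for c in s if c.isdigit()]
    if not digits:
        return False
    total = 0
    # walking right-to-left, double every second digit and
    # subtract 9 when the doubled value exceeds 9
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

For example, the standard Visa test number 4111111111111111 passes, while changing its last digit makes the check fail.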
oxidizeddreams / AWS Security Resources
Created October 12, 2018 14:53 — forked from chanj/AWS Security Resources
AWS Security Resources
INTRO
I get asked regularly for good resources on AWS security. This gist collects some of these resources (docs, blogs, talks, open source tools, etc.). Feel free to suggest and contribute.
Short Link: http://tiny.cc/awssecurity
Official AWS Security Resources
* Security Blog - http://blogs.aws.amazon.com/security/
* Security Advisories - http://aws.amazon.com/security/security-bulletins/
* Security Whitepaper (AWS Security Processes/Practices) - http://media.amazonwebservices.com/pdf/AWS_Security_Whitepaper.pdf
* Security Best Practices Whitepaper - http://media.amazonwebservices.com/AWS_Security_Best_Practices.pdf
oxidizeddreams / helpers.sh
Created October 12, 2018 14:45 — forked from jimfdavies/helpers.sh
AWS CLI helpers
# Security groups that contain 0.0.0.0/0 rules
aws ec2 describe-security-groups --filters Name=ip-permission.cidr,Values=0.0.0.0/0 --output=text | grep SECURITYGROUPS
# Security groups for ElasticSearch
aws ec2 describe-security-groups --filters Name=ip-permission.from-port,Values=9200 --output=text | grep SECURITYGROUPS
# Search last 10,000/1MB of CloudTrail logs for 'AccessDenied' (removed AWS account number from stream name)
aws logs get-log-events --log-group-name CloudTrail/DefaultLogGroup --log-stream-name 000000000000_CloudTrail_eu-west-1 | grep AccessDenied
# Get number of AWS API calls in time period (assumes a Cloudwatch Logs 'catch-all' filter and metric has been created against CloudTrail logs)
oxidizeddreams / shellRC
Created October 9, 2018 18:54
shellRC snippets
# start ssh-agent and add the default key if no agent socket is present
if [ -z "$SSH_AUTH_SOCK" ] ; then
  eval "$(ssh-agent -s)"
  ssh-add
fi
oxidizeddreams / pandas_cheat.py
Created October 7, 2018 21:16 — forked from pohzipohzi/pandas_cheat.py
Cheat sheet for the python pandas library
import numpy as np
import pandas as pd
#### creating dataframes, adding and dropping columns
df = pd.DataFrame(np.arange(1, 10).reshape(3, 3), ['A', 'B', 'C'], ['w', 'x', 'y'])
df.columns = ['W', 'X', 'Y']  # change column names
df['Z'] = df['X'] + df['Y']  # new column with values X+Y
df['XX'] = df.apply(lambda row: row['X'] * 2, axis=1)  # new column with twice the values of column X
df['YY'] = 1  # new column of ones
oxidizeddreams / README.md
Created September 27, 2018 10:46 — forked from leonardofed/README.md
A curated list of AWS resources to prepare for the AWS Certifications


A curated list of AWS resources to prepare for the AWS Certifications

A curated list of awesome AWS resources you need to prepare for all 5 AWS Certifications. This gist will include: open source repos, blogs & blog posts, ebooks, PDFs, whitepapers, video courses, free lectures, slides, sample tests and many other resources.


oxidizeddreams / to_filename.py
Created September 21, 2018 13:20 — forked from wassname/to_filename.py
python convert string to safe filename
import unicodedata
import string

valid_filename_chars = "-_.() %s%s" % (string.ascii_letters, string.digits)
char_limit = 255

def clean_filename(filename, whitelist=valid_filename_chars, replace=' '):
    # replace spaces (and any other characters in `replace`) with underscores
    for r in replace:
        filename = filename.replace(r, '_')
    # normalize unicode to its closest ASCII form, then keep only whitelisted characters
    cleaned = unicodedata.normalize('NFKD', filename).encode('ASCII', 'ignore').decode()
    return ''.join(c for c in cleaned if c in whitelist)[:char_limit]
oxidizeddreams / aws_cloudwatch_logs_streams
Created September 18, 2018 15:13
stuff for aws cloudwatch logs, groups and queries
You can achieve this by using --query to target the results of describe-log-streams. This allows you to loop through and delete the results.
aws logs describe-log-streams --log-group-name $LOG_GROUP_NAME --query 'logStreams[*].logStreamName' --output table | awk '{print $2}' | grep -v ^$ | while read x; do aws logs delete-log-stream --log-group-name $LOG_GROUP_NAME --log-stream-name $x; done
You can use --query to target all or specific groups or streams.
Delete streams from a specific month
aws logs describe-log-streams --log-group-name $LOG_GROUP --query 'logStreams[?starts_with(logStreamName,`2017/07`)].logStreamName' --output table | awk '{print $2}' | grep -v ^$ | while read x; do aws logs delete-log-stream --log-group-name $LOG_GROUP --log-stream-name $x; done