xman1980 / gist:4dc6fd5acd726395a3e4f74993271717
Created October 9, 2016 13:15
SSL_certificate_comodo_rapid
Setting up an SSL cert from Comodo
I use Namecheap.com as a registrar, and they resell SSL certs from a number of other companies, including Comodo.
These are the steps I went through to set up an SSL cert.
Purchase the cert
Prior to purchasing a cert, you need to generate a private key and a CSR (Certificate Signing Request) file. You'll be asked for the contents of the CSR file when ordering the certificate.
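For example, one common way to generate both with OpenSSL (the file names and example.com are placeholders for your own domain):
# Create a 2048-bit RSA private key and a CSR in one step;
# -nodes leaves the key unencrypted so the web server can read it on startup
openssl req -new -newkey rsa:2048 -nodes \
  -keyout example.com.key \
  -out example.com.csr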
https://launchpad.net/mysql-tuning-primer
hdfs fsck / -list-corruptfileblocks
echo "PATH=${PATH}:/opt/scripts" > /etc/profile.d/scripts-path.sh && chmod 755 /etc/profile.d/scripts-path.sh
xman1980 / gist:e2b74112637a750e6c7f604bec703180
Created January 24, 2017 09:37
nice-count-memory-process-ps-aux
# List processes sorted by memory footprint (ps "size" column, converted to MB), largest first
ps -eo size,pid,user,command --sort -size | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } print "" }'
xman1980 / gist:3f7f78b2982779b5eddd7a1d174d6a0e
Created March 27, 2017 17:17
hue_useradmin_sync_with_unix
# get the current HUE credential password for CM
# (/proc/PID/environ is NUL-delimited, so convert NULs to newlines before filtering)
cat /proc/`ps -ef | grep hue |\
grep runcherrypyserver |\
awk '{print $2}'`/environ |\
tr '\0' '\n' | grep HADOOP_CREDSTORE_PASSWORD
export HUE_CONF_DIR="/var/run/cloudera-scm-agent/process/`ls -alrt /var/run/cloudera-scm-agent/process | grep HUE | tail -1 | awk '{print $9}'`"
export JAVA_HOME=/usr/java/default
export HADOOP_CREDSTORE_PASSWORD=xxxxxxxxxxxxxxx
build/env/bin/hue useradmin_sync_with_unix
xman1980 / pruneStagingDirs.sh
Created March 28, 2017 16:41
A loop to remove old staging dirs. This is a part of the workaround for a bug tracked here: https://issues.apache.org/jira/browse/MAPREDUCE-5351
#!/bin/bash
# Remove staging dirs older than six hours (workaround for MAPREDUCE-5351)
NOW=`date +%s`
SIXHOURSAGO=`echo "$NOW - 21600" | bc`
HADOOPBIN="/usr/bin/hadoop"
IFS=$'\n'
for i in `$HADOOPBIN fs -ls /user/root/.staging/ | grep '^d'`; do
  IFS=' '
  JOBDATE=`echo $i | awk '{print $6" "$7}'`
  JOBSECS=`date -d "$JOBDATE" +%s`
  DIR=`echo $i | awk '{print $8}'`
  if [ "$JOBSECS" -lt "$SIXHOURSAGO" ]; then
    $HADOOPBIN fs -rm -r -skipTrash "$DIR"
  fi
  IFS=$'\n'
done
If you happen to be using the CDH distribution of Hadoop, it comes with a very useful HdfsFindTool command, which behaves like Linux's find command.
If you're using the default parcel location, here's how you'd do it:
hadoop jar /opt/cloudera/parcels/CDH/jars/search-mr-*-job.jar \
org.apache.solr.hadoop.HdfsFindTool -find PATH -mtime +N
Where you'd replace PATH with the search path and N with the number of days.
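For instance, to list everything under the staging directory from the script above that is more than one day old (the path is only an example):
hadoop jar /opt/cloudera/parcels/CDH/jars/search-mr-*-job.jar \
org.apache.solr.hadoop.HdfsFindTool -find /user/root/.staging -mtime +1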
xman1980 / bash
Created May 3, 2017 07:50
backup hdfs fsimage
#!/bin/bash
#######################
# Back up the Hadoop filesystem metadata (fsimage) via the NameNode URL
# Change the variables to match the appropriate env and put this script in crontab
#######################
#Variables
TODAY=$(date +"%Y-%m-%d-%H%M") #date and time
BACKUP_PATH="/home/backup/hadoop/fsimage" #path to store metadata
RT_DAYS="4" #Retention in days
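A minimal sketch of the fetch-and-prune steps, assuming a Hadoop 2.x NameNode image transfer servlet on the default HTTP port (namenode.example.com and the port are placeholders):
NN_URL="http://namenode.example.com:50070" #placeholder NameNode address
mkdir -p "$BACKUP_PATH"
# Fetch the latest fsimage through the NameNode's image transfer servlet
curl -s -o "$BACKUP_PATH/fsimage-$TODAY" "$NN_URL/imagetransfer?getimage=1&txid=latest"
# Prune backups older than the retention window
find "$BACKUP_PATH" -type f -mtime +"$RT_DAYS" -delete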
xman1980 / bash
Created May 3, 2017 08:00
Finding directories older than N days in HDFS
#!/bin/bash
usage="Usage: dir_diff.sh [days]"
if [ -z "$1" ]; then
  echo "$usage"
  exit 1
fi
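A minimal sketch of the comparison loop that follows, assuming the standard hadoop fs -ls output format (field 6 is the modification date) and /user as an example search root:
now=$(date +%s)
# Walk the directories under the search root and print those older than $1 days
hadoop fs -ls /user | grep '^d' | while read -r line; do
  dir_date=$(echo "$line" | awk '{print $6}')
  age_days=$(( (now - $(date -d "$dir_date" +%s)) / 86400 ))
  if [ "$age_days" -gt "$1" ]; then
    echo "$line"
  fi
done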