Sean Roberts seanorama

  • Cloudera (formerly Hortonworks, Rackspace)
  • London, UK
  • X @seano
@seanorama
seanorama / README.md
Last active April 19, 2019 16:39
ambari-hdp31-stack-patches

Patch to the HDP stack in Ambari 2.7

On Ambari Server

wget https://gist.githubusercontent.com/seanorama/99d13ca1898b33d6ebd1322193062b5d/raw/f5f43b6bac31e1f44f2320d72fb2d1ead651272c/hdp30_alert_hive_interactive_thrift_port.py.patch
sudo patch -b -d/ -p0 < hdp30_alert_hive_interactive_thrift_port.py.patch

wget https://gist.githubusercontent.com/seanorama/99d13ca1898b33d6ebd1322193062b5d/raw/f5f43b6bac31e1f44f2320d72fb2d1ead651272c/hdp30_alert_spark2_thrift_port.patch
sudo patch -b -d/ -p0 < hdp30_alert_spark2_thrift_port.patch
@seanorama
seanorama / README.md
Last active December 18, 2018 09:44
hadoop-tmp-noexec

Fixes for when /tmp is mounted with noexec

Applies to HDP (Hortonworks Data Platform) but likely relevant for other Hadoop distributions.

This is not complete; I'm adding to it as issues are found.

Prep

  1. Create an alternative tmp directory:

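A hedged sketch of that step — the directory path is an assumption for illustration, not a value from the gist (in production it would be created as root on an exec-mounted filesystem):

```shell
# Hypothetical alternative tmp directory; in production something like /hadoop/tmp.
ALT_TMP="${ALT_TMP:-./hadoop-tmp-demo}"
mkdir -p "${ALT_TMP}"
chmod 1777 "${ALT_TMP}"   # same sticky-bit permissions as /tmp
```

Services would then be pointed at it, e.g. via `-Djava.io.tmpdir` in their JVM options or `hadoop.tmp.dir` in core-site.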
SSSD Configuration

This is what I use on Hortonworks HDP (Hadoop) systems, but it should work for anyone.

Some settings are tuned for Active Directory without relying on the 'sssd-ad' provider, so the hosts don't need to join the domain.

Install requirements

sudo yum install sssd sssd-ldap sssd-krb5 sssd-tools authconfig \
  oddjob oddjob-mkhomedir openldap-clients cyrus-sasl-gssapi \
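A minimal sssd.conf along those lines might look like the following — the domain name, server URI, search base, and realm are placeholders, not values from the gist:

```ini
[sssd]
services = nss, pam
domains = EXAMPLE.COM

[domain/EXAMPLE.COM]
# LDAP for identity, Kerberos for auth -- no 'sssd-ad' provider / domain join
id_provider = ldap
auth_provider = krb5
ldap_uri = ldaps://ad.example.com
ldap_search_base = dc=example,dc=com
ldap_schema = ad
krb5_realm = EXAMPLE.COM
```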

The Oozie Web UI requires ExtJS, which is not provided by default as of HDP (Hortonworks Data Platform) 2.6.5.

## Enable HDP-UTILS-GPL
##   Note: This will break with HDP upgrades!
cd /etc/yum.repos.d/
sudo wget https://public-repo-1.hortonworks.com/HDP-UTILS-GPL-1.1.0.22/repos/centos7/hdp-utils-gpl.repo

## Install ExtJS for Oozie
## Note: This must be updated manually and presents a security hole if not updated.
## No longer using this but keeping for posterity
## Functions so we don't repeat ourselves (DRY)
function lv_name(){
  [[ "$1" ]] || { echo "Error: Missing path" >&2; return 1; }
  export lv_name="lv_$(echo "$1" | sed -r -e 's,^/,,' -e 's,/$,,' -e 's,/,_,g')"
}
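For illustration, the name derivation can be checked standalone; `lv_name_of` here is a POSIX variant of the helper above, renamed so it prints rather than exports:

```shell
# Derive an LVM volume name from a mount path, e.g. /hadoop/hdfs/data -> lv_hadoop_hdfs_data
lv_name_of() {
  [ -n "$1" ] || { echo "Error: Missing path" >&2; return 1; }
  echo "lv_$(echo "$1" | sed -r -e 's,^/,,' -e 's,/$,,' -e 's,/,_,g')"
}
lv_name_of /hadoop/hdfs/data   # prints: lv_hadoop_hdfs_data
```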
function my_mount_lvm(){
  [[ "$1" ]] || { echo "Error: Missing path" >&2; return 1; }
@seanorama
seanorama / httpfs.md
Last active November 15, 2018 15:20
httpfs

HttpFS installation with Kerberos

From Ambari, update core-site with:

hadoop.proxyuser.httpfs.groups=the-groups-it-can-impersonate
hadoop.proxyuser.httpfs.hosts=the-hosts-with-httpfs
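In core-site.xml form the same two properties look like this; the group and host values below are illustrative placeholders:

```xml
<property>
  <name>hadoop.proxyuser.httpfs.groups</name>
  <value>hadoop-users</value>          <!-- groups httpfs may impersonate -->
</property>
<property>
  <name>hadoop.proxyuser.httpfs.hosts</name>
  <value>httpfs01.example.com</value>  <!-- hosts running httpfs -->
</property>
```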

On the KDC master, create Kerberos principals for each HttpFS host

@seanorama
seanorama / flux-adjust-color.md
Created November 2, 2018 03:23
flux-adjust-color.md

How to manually adjust the color temperature of f.lux on macOS

Including temperatures below the default minimum of 1200K.

Why: so we can eliminate green light along with blue, as I'm doing here with 900K.

## adjust night time temp
defaults write org.herf.Flux lateColorTemp -int 900
@seanorama
seanorama / cluster-shells.md
Last active July 5, 2019 14:53
cluster-shell-tools

Cluster shelling with pssh or tmux-cssh

## populate the hosts file, with IPs or hostnames
tee <<-'EOF' ~/.hosts >/dev/null
ip-10-42-4-10.ec2.internal
ip-10-42-4-30.ec2.internal
ip-10-42-4-31.ec2.internal
ip-10-42-4-32.ec2.internal
ip-10-42-4-33.ec2.internal
ip-10-42-4-50.ec2.internal
@seanorama
seanorama / README.md
Last active October 2, 2019 05:21
Knox with PAM
  1. Create the 'cloudera' PAM service file:
sudo tee /etc/pam.d/cloudera > /dev/null <<-'EOF'
#%PAM-1.0
auth    sufficient        pam_unix.so
auth    sufficient        pam_sss.so
account sufficient        pam_unix.so
account sufficient        pam_sss.so
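Knox is then pointed at that PAM service from the ShiroProvider in its topology; a sketch, assuming the service name 'cloudera' matches the file above (on older Knox releases the realm class lives under org.apache.hadoop.gateway instead):

```xml
<param>
  <name>main.pamRealm</name>
  <value>org.apache.knox.gateway.shirorealm.KnoxPamRealm</value>
</param>
<param>
  <name>main.pamRealm.service</name>
  <value>cloudera</value> <!-- matches /etc/pam.d/cloudera -->
</param>
```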
@seanorama
seanorama / README.md
Last active August 7, 2019 13:45
holland-backup-to-hdfs

Backup MySQL/MariaDB with holland backup

Including copying the backup to HDFS. Uses Kerberos by default, but you can strip that out.

  • If your environment is not using Kerberos, remove the kinit/kdestroy commands.
  • Works with any holland module but this shows mysqldump.
  • Assumes credentials are in /root/.my.cnf.

kinit as the smokeuser, or whichever user you want the backup to run as in HDFS
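The flow might be sketched as follows — the keytab, principal, and paths are assumptions for illustration, not values from the gist:

```shell
# Sketch: kinit, copy the holland backup spool into HDFS, destroy the ticket.
KEYTAB=/etc/security/keytabs/smokeuser.headless.keytab
PRINCIPAL=ambari-qa   # HDP's default smokeuser; adjust to your cluster
SRC=/var/spool/holland/default
DEST=/backups/mysql

if command -v hdfs >/dev/null 2>&1; then
  kinit -kt "$KEYTAB" "$PRINCIPAL"   # drop if the cluster is not kerberized
  hdfs dfs -mkdir -p "$DEST"
  hdfs dfs -put -f "$SRC" "$DEST"
  kdestroy
else
  echo "hdfs client not found; run this on a cluster node"
fi
```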