
@nsabharwal
nsabharwal / servicedeftest.txt
Created June 14, 2020 00:49
service def test
[ec2-user@ip-172-31-3-208 nifi]$ cat privacera_nifi.json
{
  "isEnabled": "true",
  "type": "nifi",
  "configs": {
    "username": "nifi"
  },
  "name": "privacera_nifi",
  "description": "nifi repo"
}
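Illustrative only: a service-definition JSON like this is typically registered against the Ranger admin REST API with curl. The admin URL, port, and credentials below are assumptions, not from the gist.
# hedged sketch: register the service via the Ranger public v2 API (hypothetical host/credentials)
curl -u admin:admin -H "Content-Type: application/json" -X POST \
  -d @privacera_nifi.json http://ranger-admin-host:6080/service/public/v2/api/service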
@nsabharwal
nsabharwal / gist:58d2379023407ed2857d4957ad29fb77
Created February 15, 2020 16:54
HDP install step - notes
neeraj_mac:emrfs-cf neerajsab$ ssh -i ~/.ssh/ns-privacera-prod.pem [email protected]
[centos@ip-172-31-41-233 ~]$ ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/home/centos/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/centos/.ssh/id_rsa.
Your public key has been saved in /home/centos/.ssh/id_rsa.pub.
The key fingerprint is:
SHA256:Y3A4qhJucOe2XXxzpdtD6OFcnlCRAYYPPNArIniyUBs [email protected]
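A common next step after generating the key pair is to distribute the public key so the install can ssh between nodes without a password. A sketch; the target host below is hypothetical.
# hedged sketch: copy the new public key to another cluster node (hypothetical host)
ssh-copy-id -i ~/.ssh/id_rsa.pub centos@<other-node-ip>
# or append it by hand
cat ~/.ssh/id_rsa.pub | ssh centos@<other-node-ip> 'cat >> ~/.ssh/authorized_keys'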
@nsabharwal
nsabharwal / S3demo
Created September 18, 2019 05:23
s3 demo
import subprocess, json, sys

# Run a shell command and return its output; print the command first and
# pause for Enter so each step can be walked through during the demo.
def runAndGet(cmd, parse=False):
    print(cmd)
    dummy = raw_input()  # Python 2: wait for Enter before running the command
    try:
        data = subprocess.check_output(cmd, shell=True)
    except:
        sys.exit(0)
    return json.loads(data) if parse else data  # assumed completion: parse JSON output when parse=True
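A hedged usage sketch of the helper above; the bucket name and aws CLI commands are illustrative assumptions, not taken from the gist.
# example (hypothetical bucket): step through a couple of s3 commands, pausing before each one
runAndGet("aws s3 ls s3://my-demo-bucket/")
objects = runAndGet("aws s3api list-objects --bucket my-demo-bucket", parse=True)
print(objects.get("Contents", []))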
@nsabharwal
nsabharwal / cassandra advrep
Last active October 1, 2016 22:00
cassandra advrep
## Start cluster
/Users/neeraj/dse/bin/dse cassandra
## Stop cluster
/Users/neeraj/dse/bin/dse cassandra-stop
## Enable advanced replication
vi /Users/neeraj/dse/resources/dse/conf/dse.yaml
./nodetool status
export CQLSH_HOST=192.168.1.185
./cqlsh
# run this in edge and hub cluster
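A quick sanity check that cqlsh can reach the node before testing replication; these commands are illustrative and not part of the original notes.
# hedged sketch: confirm connectivity from the edge and hub nodes
./cqlsh -e "DESCRIBE KEYSPACES"
./nodetool describecluster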
@nsabharwal
nsabharwal / Druid setup
Last active May 26, 2016 22:06
Druid setup
## git clone https://github.com/Banno/druid-docker
## git clone https://github.com/zcox/druid-pageviews
cd druid-docker
fig kill && fig rm --force
./build.sh
fig up -d druid
#### docker
## `boot2docker shellinit`
## docker ps
##
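Once fig brings the container up, a quick check that it is actually running; a sketch using the same fig tool the gist already uses.
# hedged sketch: verify the druid container started and tail its logs
fig ps
fig logs druid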
su - hdfs
git clone https://github.com/pivotalsoftware/pivotal-samples.git
# I copied the files to /mnt
cd /mnt/pivotal-samples/sample-data
# load data into HDFS
sh -x load_data_to_HDFS.sh ## -x echoes each command the script runs along with its output
hdfs dfs -ls /retail_demo
#### Hive table #####
https://github.com/pivotalsoftware/pivotal-samples/blob/master/hive/create_hive_tables.sql
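For reference, an external table of the shape those scripts create might look like the sketch below; the table name, columns, and delimiter are illustrative assumptions, not copied from the linked SQL.
# hedged sketch (hypothetical columns/delimiter), run from the hive CLI
hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS retail_demo_orders (
  order_id STRING, customer_id STRING, order_amount DOUBLE)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  LOCATION '/retail_demo/orders';"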
# install
export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.dylib
git clone https://github.com/mesos/chronos.git
cd chronos
#must have node installed
mvn package
#start on port 8081
java -cp target/chronos*.jar org.apache.mesos.chronos.scheduler.Main --master 127.0.0.1:5050 --zk_hosts localhost:2181 --http_port 8081
#list jobs
curl -L -X GET localhost:8081/scheduler/jobs
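Beyond listing jobs, a job can be registered by POSTing JSON to the scheduler endpoint; the job name, command, schedule, and owner below are made-up examples.
# hedged sketch: create a simple scheduled job (hypothetical job definition)
curl -L -H 'Content-Type: application/json' -X POST \
  -d '{"name":"sample-job","command":"echo hello","schedule":"R/2016-06-01T00:00:00Z/PT24H","epsilon":"PT15M","owner":"someone@example.com"}' \
  localhost:8081/scheduler/iso8601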
@nsabharwal
nsabharwal / Marathon
Created May 17, 2016 00:44
Marathon
# start local zk
/var/root/zookeeper-3.4.8/bin/zkServer.sh start
#start marathon
# marathon connects to the existing mesos instance instead of spinning up its own embedded instance
#--master 127.0.0.1:5050 and zk://localhost:2181/mesos
MESOS_NATIVE_JAVA_LIBRARY=/Users/nsabharwal/mesos-0.28.1/build/src/.libs/libmesos.dylib
/var/root/marathon-1.1.1/bin/start --master 127.0.0.1:5050 --zk zk://localhost:2181/mesos
##########################################################
#run mesos
#spin up zk instance
#start marathon
MESOS_NATIVE_JAVA_LIBRARY=/Users/nsabharwal/mesos-0.28.1/build/src/.libs/libmesos.dylib ./bin/start --master local --zk zk://localhost:2181/marathon
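With marathon up (default http port 8080), an app can be launched through the REST API; the app id and command below are illustrative, not from the original notes.
# hedged sketch: deploy a trivial app via the Marathon v2 API (hypothetical app definition)
curl -X POST http://localhost:8080/v2/apps \
  -H 'Content-Type: application/json' \
  -d '{"id":"sleep-test","cmd":"sleep 600","cpus":0.1,"mem":32,"instances":1}'
# check its status
curl http://localhost:8080/v2/apps/sleep-test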