yum list installed | grep -i "graphite\|carbon\|whisper"
graphite-web.noarch          0.9.12-5.el6     @epel
graphite-web-selinux.noarch  0.9.12-5.el6     @epel
python-carbon.noarch         0.9.12-3.el6.1   @epel
python-whisper.noarch        0.9.12-1.el6     @epel

Graphite Install
1. Install dependencies
ansible-playbook -i hosts update_yum.yml
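The update_yum.yml playbook itself is not shown here. A minimal manual equivalent of the dependency install, assuming the EPEL repository is already enabled and that the carbon-cache init script shipped with the EPEL python-carbon package is the one in use:
yum install -y graphite-web graphite-web-selinux python-carbon python-whisper
service carbon-cache start     # verify the service name with: chkconfig --list | grep carbon
chkconfig carbon-cache on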
#!/bin/bash
# TestDFSIO will be run with a total file size of 1TB using different dfs.block.size variations.
# Usage: TestDFSIO [genericOptions] -read | -write | -append | -clean [-nrFiles N] [-fileSize Size[B|KB|MB|GB|TB]] [-resFile resultFileName] [-bufferSize Bytes] [-rootDir]
#
# The test is designed with two variables:
# 1) file_sizes_mb: file size variation, 1GB file x 1,000 = 1TB and 100MB file x 10,000 = 1TB,
#    to test the impact of large vs. small files on HDFS
# 2) dfs.block.size (MB) variation: 512, 256, 128, 50, 10,
#    to test the impact of different block sizes
#
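# The rest of the script is not shown above. A minimal sketch of one write run it would perform,
# assuming the MR1 hadoop-test.jar from the same CDH parcel (jar path and result-file path are
# placeholders; the script is expected to run as the hdfs user):
hadoop_test_jar=/opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/hadoop-test.jar
block_size_mb=128
# 1,000 files x 1GB = 1TB, written with the chosen block size
hadoop jar "$hadoop_test_jar" TestDFSIO \
    -D dfs.block.size=$((block_size_mb * 1024 * 1024)) \
    -write -nrFiles 1000 -fileSize 1GB \
    -resFile /tmp/TestDFSIO_write_${block_size_mb}MB.txt
# clean up the benchmark data between runs
hadoop jar "$hadoop_test_jar" TestDFSIO -clean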
#!/bin/bash
# TeraSort benchmark
# Usage: hadoop jar hadoop-*examples*.jar teragen <number of 100-byte rows> <output dir>
#
# command to run with nohup:
# nohup bash ./run_terasort.sh > terasort.out 2>&1 &
# sudo -u hdfs nohup bash /tmp/run_terasort.sh > /tmp/terasort.out 2>&1 &
hadoop_jar=/opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/hadoop-examples.jar
# TeraGen: 1TB = 1,000,000,000,000 bytes = 1e12 bytes = 100 bytes * 1e10 rows
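# The rest of the script is not shown above. A minimal sketch of the three phases it would run
# (the HDFS paths below are placeholders; the script is expected to run as the hdfs user):
rows=10000000000    # 1e10 rows x 100 bytes = 1TB
hadoop jar "$hadoop_jar" teragen      $rows                       /benchmarks/terasort-input
hadoop jar "$hadoop_jar" terasort     /benchmarks/terasort-input  /benchmarks/terasort-output
hadoop jar "$hadoop_jar" teravalidate /benchmarks/terasort-output /benchmarks/terasort-report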
#!/bin/bash
# MapReduce pi calculation to validate the Hadoop cluster setup
#
# command to run with nohup:
# nohup bash ./run_pi_job.sh > pi_job.out 2>&1 &
# sudo -u hdfs nohup bash /tmp/run_pi_job.sh > /tmp/pi_job.out 2>&1 &
# parcel path
hadoop_jar=/opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/hadoop-examples.jar
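# The rest of the script is not shown above. A minimal sketch of the job it would submit
# (10 maps x 100,000 samples is an arbitrary choice, just enough to exercise the cluster):
hadoop jar "$hadoop_jar" pi 10 100000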
NTP references:
------------------------------
http://serverfault.com/questions/204082/using-ntp-to-sync-a-group-of-linux-servers-to-a-common-time-source/204138#204138
http://www.ntp.org/ntpfaq/NTP-s-config-adv.htm
http://askubuntu.com/questions/14558/how-do-i-setup-a-local-ntp-server
http://www.thegeekstuff.com/2014/06/linux-ntp-server-client/
http://www.linuxsolutions.org/faqs/generic/ntpserver
https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/6/html/Deployment_Guide/s1-Understanding_the_ntpd_Configuration_File.html
------------------------------
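A minimal sketch of pointing a cluster node at a single in-house NTP server, distilled from the references above (the server hostname is a placeholder, and a production ntp.conf normally carries additional restrict/server options):
cat > /etc/ntp.conf << 'EOF'
server ntp-master.wonderland.com iburst
driftfile /var/lib/ntp/drift
EOF
service ntpd restart
chkconfig ntpd on
ntpq -p    # verify the peer is reachable and selected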
#1. create a sample avro schema
cat > example.avsc << EOF
{"namespace": "example.avro",
 "type": "record",
 "name": "User",
 "fields": [
     {"name": "name", "type": "string"},
     {"name": "favorite_number", "type": ["int", "null"]},
     {"name": "favorite_color", "type": ["string", "null"]}
 ]
}
EOF
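# A sketch of a round trip through the schema, assuming the avro-tools jar is on hand
# (the jar path below is a placeholder; adjust to your install):
avro_tools=/usr/lib/avro/avro-tools.jar
cat > users.json << EOF
{"name": "Alice", "favorite_number": {"int": 7}, "favorite_color": {"string": "blue"}}
{"name": "Bob", "favorite_number": null, "favorite_color": null}
EOF
java -jar "$avro_tools" fromjson --schema-file example.avsc users.json > users.avro
java -jar "$avro_tools" tojson users.avro    # should print the two records back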
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
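With the shade plugin bound to the package phase as above, a typical build-and-submit sequence looks like this (the artifact name, driver class, and HDFS paths are placeholders):
mvn clean package
hadoop jar target/my-avro-job-1.0-SNAPSHOT.jar com.example.MyDriver /input/path /output/path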
#!/usr/bin/env bash
# To enable debugging, change debug to 1; the temporary hosts file will then not be deleted.
debug=0
## User-defined arguments
user="admin"
pass="admin"
## Hostname of the CM instance here:
scm="http://test-1.wonderland.com:7180/api/v6"
## Cluster name here. Replace spaces with %20 to comply with URL rules.
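# The rest of the script is not shown above. A sketch of two smoke-test calls using the variables
# already defined (the cluster name below is a placeholder, with spaces replaced by %20 as noted;
# endpoints follow the CM API v6 REST layout):
curl -s -u "$user:$pass" "$scm/clusters" | python -m json.tool
curl -s -u "$user:$pass" "$scm/clusters/Cluster%201/services" | python -m json.tool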
-----
for p in `kadmin.local -q listprincs` ; do kadmin.local -q "modprinc -maxrenewlife 1000days $p" ; done
-----
kadmin.local -q "getprincs" > principals.txt
vi principals.txt
Remove the non-Hadoop principals from principals.txt, then run this small loop to update the existing principals:
for princ in `cat principals.txt`; do kadmin.local -q "modprinc -maxrenewlife 7days $princ"; done;
service krb5kdc restart
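The renewable lifetime of issued tickets is also capped by the krbtgt principal's maxrenewlife, so make sure it is covered by the loop above. A quick spot-check on one principal (the realm name here is a placeholder):
kadmin.local -q "getprinc krbtgt/WONDERLAND.COM@WONDERLAND.COM" | grep -i renew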
http://answers.oreilly.com/topic/460-how-to-benchmark-a-hadoop-cluster/
http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
## MR pi
https://gist.github.com/jeongho/371aaed47ab462d79851
## Terasort
https://gist.github.com/jeongho/3b8c028f5e8409c3a10a
## TestDFSIO