@wavescholar
Last active August 29, 2015 14:06
CDH5 on Fedora 20
First, do this:
https://gist.github.com/wavescholar/6cc708de5f9bea623c86
Get the RPM (see the quickstart guide):
http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH5/latest/CDH5-Quick-Start/cdh5qs_yarn_pseudo.html
sudo yum --nogpgcheck localinstall cloudera-cdh-5-0.x86_64.rpm
Install the CDH GPG key first, then the package:
rpm --import http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/RPM-GPG-KEY-cloudera
sudo yum install hadoop-conf-pseudo
To view the files on Red Hat or SLES systems:
rpm -ql hadoop-conf-pseudo
Format the NameNode:
sudo -u hdfs hdfs namenode -format
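The quickstart's intermediate step, starting the HDFS daemons, belongs between formatting the NameNode and creating the `/tmp` directories; the history section further down does it with a glob over /etc/init.d. A dry-run sketch that just prints the equivalent per-service commands (service names assumed from the hadoop-conf-pseudo package layout):

```shell
# Dry run: print the start command for each HDFS daemon that
# hadoop-conf-pseudo installs (names assumed from the CDH5 pseudo layout).
for svc in hadoop-hdfs-namenode hadoop-hdfs-datanode hadoop-hdfs-secondarynamenode; do
  echo "sudo service $svc start"
done
```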
Step 3: Create the /tmp, Staging and Log Directories
Remove the old /tmp if it exists:
$ sudo -u hdfs hadoop fs -rm -r /tmp
Create the new directories and set permissions:
$ sudo -u hdfs hadoop fs -mkdir -p /tmp/hadoop-yarn/staging/history/done_intermediate
$ sudo -u hdfs hadoop fs -chown -R mapred:mapred /tmp/hadoop-yarn/staging
$ sudo -u hdfs hadoop fs -chmod -R 1777 /tmp
$ sudo -u hdfs hadoop fs -mkdir -p /var/log/hadoop-yarn
$ sudo -u hdfs hadoop fs -chown yarn:mapred /var/log/hadoop-yarn
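The `-chmod -R 1777` above sets the sticky bit plus rwx for everyone, which is what lets any user write job files under /tmp without being able to delete other users' files. A quick local-filesystem sketch of the same mode bits (plain chmod/stat, nothing Hadoop-specific; `stat -c` is GNU coreutils, present on Fedora):

```shell
# Demonstrate mode 1777 (sticky bit + rwxrwxrwx) on a throwaway local dir;
# the same permission semantics apply to the HDFS /tmp chmod'ed above.
d=$(mktemp -d)
chmod 1777 "$d"
stat -c '%a' "$d"   # prints 1777
rmdir "$d"
```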
-------------------------------- My command history
yum --nogpgcheck localinstall cloudera-cdh-5-0.x86_64.rpm
rpm --import http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/RPM-GPG-KEY-cloudera
sudo yum install hadoop-conf-pseudo
rpm -ql hadoop-conf-pseudo
sudo -u hdfs hdfs namenode -format
for x in `cd /etc/init.d ; ls hadoop-hdfs-*` ; do sudo service $x start ; done
sudo -u hdfs hadoop fs -rm -r /tmp
sudo -u hdfs hadoop fs -mkdir -p /tmp/hadoop-yarn/staging/history/done_intermediate
sudo -u hdfs hadoop fs -chown -R mapred:mapred /tmp/hadoop-yarn/staging
sudo -u hdfs hadoop fs -chmod -R 1777 /tmp
sudo -u hdfs hadoop fs -mkdir -p /var/log/hadoop-yarn
sudo -u hdfs hadoop fs -chown yarn:mapred /var/log/hadoop-yarn
sudo -u hdfs hadoop fs -ls -R /
sudo service hadoop-yarn-resourcemanager start
sudo service hadoop-yarn-nodemanager start
sudo service hadoop-mapreduce-historyserver start
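Once the YARN services are up, they can be sanity-checked through their web UIs. A sketch that prints the URLs to check; the port numbers are the stock Hadoop 2 defaults (8088 ResourceManager, 8042 NodeManager, 19888 JobHistory) and would differ if overridden in the site configs:

```shell
# Print the default web-UI URL for each YARN/MapReduce daemon started above
# (ports are Hadoop 2 defaults, assumed unchanged in this pseudo setup).
for entry in "resourcemanager:8088" "nodemanager:8042" "historyserver:19888"; do
  echo "check http://localhost:${entry#*:} for ${entry%%:*}"
done
```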
----------------------------------------------------------------------------- Not this: you get an error when running the example
sudo -u hdfs hadoop fs -mkdir /tmp/input
sudo -u hdfs hadoop fs -mkdir -p /home/bcampbell/input
sudo -u hdfs hadoop fs -mkdir /home/bcampbell/hadoop
sudo -u hdfs hadoop fs -chown bcampbell /home/bcampbell/input/
sudo -u hdfs hadoop fs -chown bcampbell /home/bcampbell/hadoop
--------------------------------------------------------------------------- Do this instead
[root@localhost bcampbell]# sudo -u hdfs hadoop fs -mkdir /user
[root@localhost bcampbell]# sudo -u hdfs hadoop fs -mkdir /user/bcampbell
[root@localhost bcampbell]# sudo -u hdfs hadoop fs -mkdir /user/bcampbell/input
[root@localhost bcampbell]# sudo -u hdfs hadoop fs -chown bcampbell /user/bcampbell
[root@localhost bcampbell]# sudo -u hdfs hadoop fs -chown bcampbell /user/bcampbell/input
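The point of the block above is that per-user data belongs under /user/<name> in HDFS, not /home. The same setup for any user can be collapsed with `-mkdir -p` and a recursive chown; a dry-run sketch, with "alice" as a made-up example username:

```shell
# Dry run: print the generic per-user HDFS home-directory setup
# ("alice" is a hypothetical user; substitute the real login name).
u=alice
for cmd in "-mkdir -p /user/$u/input" "-chown -R $u /user/$u"; do
  echo "sudo -u hdfs hadoop fs $cmd"
done
```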
------------------------------------------------------------------------
As user bcampbell:
hadoop fs -copyFromLocal /home/bcampbell/Downloads/Milton_ParadiseLost.txt /user/bcampbell/input
hdfs fsck /user/bcampbell/input/ -files -blocks
hadoop fs -copyFromLocal /home/bcampbell/Downloads/WilliamYeats.txt /user/bcampbell/input
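With the texts uploaded, the bundled MapReduce example can be run over them. A dry-run sketch of the two commands (the examples-jar path is assumed from the CDH5 hadoop-mapreduce package; relative paths resolve under /user/bcampbell):

```shell
# Dry run: print the wordcount invocation over the uploaded input,
# then the command to read the result back out of HDFS.
cat <<'EOF'
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar wordcount input output
hadoop fs -cat output/part-r-00000 | head
EOF
```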