
HDFS

Checking for under-replicated blocks

sudo -u hdfs  hdfs fsck / | grep 'Under replicated' | wc -l
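
If the count is non-zero, a common follow-up (a sketch, not part of the original notes; the target replication factor of 3 and the temp file path are assumptions) is to extract the affected paths from the fsck report and raise their replication so the NameNode re-replicates the blocks:

# fsck prints one "Under replicated" line per affected block, prefixed by the file path
sudo -u hdfs hdfs fsck / | grep 'Under replicated' | awk -F: '{print $1}' | sort -u > /tmp/under_replicated_files

# Re-set replication to the target factor (3 here) and wait for each file to finish
while read -r f; do
  sudo -u hdfs hdfs dfs -setrep -w 3 "$f"
done < /tmp/under_replicated_files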

ls

## Exit Code:
## Returns 0 on success and -1 on error.

[root@hdp-m-01 ~]# sudo -u hdfs  hdfs dfs -ls  /user/user/share/lib/uber-pipelines-0.0.24-SNAPSHOT.jar
-rw-r--r--   3 user user   33556147 2018-02-15 07:39 /user/user/share/lib/uber-pipelines-0.0.24-SNAPSHOT.jar

https://hadoop.apache.org/docs/r2.4.1/hadoop-project-dist/hadoop-common/FileSystemShell.html#ls
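
Since -ls returns 0 on success and a non-zero code on error, its exit status can drive a simple existence check from a script; a minimal sketch using the jar path from the listing above:

# Suppress output and branch on the exit code of hdfs dfs -ls
if sudo -u hdfs hdfs dfs -ls /user/user/share/lib/uber-pipelines-0.0.24-SNAPSHOT.jar > /dev/null 2>&1; then
  echo "path exists"
else
  echo "ls failed: path missing or other error"
fi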

Create user

https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-views/content/setup_HDFS_user_directory_pig_view.html

  • Connect to a host in the cluster that includes the HDFS client.
  • Switch to the hdfs system account user: su - hdfs
  • Using the HDFS client, create an HDFS home directory for the user. For example, if the username is admin, create /user/admin: hadoop fs -mkdir /user/admin
  • Set the ownership on the newly created directory so that user owns it (see the sketch after this list): hadoop fs -chown admin:hadoop /user/admin
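
A minimal end-to-end sketch of those steps, assuming the username is admin and the default hadoop group (adjust both for your cluster):

# Run on a host that has the HDFS client installed
su - hdfs

# Create the user's HDFS home directory ("admin" is just an example username)
hadoop fs -mkdir /user/admin

# Hand ownership to the user; "hadoop" is the commonly used default group
hadoop fs -chown admin:hadoop /user/admin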