Let's assume that you have a cluster named awesome_cluster.
On a fresh Ambari cluster, we need to follow these steps to create the HDFS view.
Well... Ambari impersonates the currently logged-in user through the superuser (root in this case), so the simplest (but not the best) thing is to allow the superuser to impersonate all users. Similarly, we also need to tell Hadoop the hosts from which the superuser can connect (the simplest is to allow all hosts).
On the Ambari dashboard go to Services > HDFS > Config > Advanced > Custom Core Site and add the following new parameters:
hadoop.proxyuser.root.groups = *
hadoop.proxyuser.root.hosts = *
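After saving the change (Ambari will prompt you to restart the affected services), you can sanity-check that the properties are active from any cluster node. This is just a quick verification sketch, assuming the HDFS client is installed on that node:

# Both commands should print: *
hdfs getconf -confKey hadoop.proxyuser.root.groups
hdfs getconf -confKey hadoop.proxyuser.root.hosts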
Now you will have to create a corresponding home directory on HDFS for the logged-in user (in this case 'admin'):
sudo su hdfs
hadoop fs -mkdir /user/admin
hadoop fs -chown admin:hadoop /user/admin
Similarly... create an HDFS home directory for any other user (awesome.user):
sudo su hdfs
hadoop fs -mkdir /user/awesome.user
hadoop fs -chown awesome.user:hadoop /user/awesome.user
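If you have several users to onboard, the same two commands can be wrapped in a small loop. This is only a sketch; the user list below is a placeholder for your own users:

sudo su hdfs
# -p makes the mkdir idempotent if the directory already exists
for u in admin awesome.user; do
  hadoop fs -mkdir -p /user/$u
  hadoop fs -chown $u:hadoop /user/$u
done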
Go to admin > manage ambari > Views > HDFS > Create Instance.
We will need to change the following fields:
| Field name | Value |
|---|---|
| Instance Name | awesome_hdfs_view |
| Display Name | AWESOME_HDFS_VIEW |
| Description | My awesome view for awesome hdfs |
It's fine to leave the other fields at their defaults.
The admin user will have permission for this view by default; for other users (awesome.user), follow these steps.
- Go to admin > manage ambari > User + Group Management > Groups and create a group named hdfs_users.
- Now go to admin > manage ambari > Views > HDFS > AWESOME_HDFS_VIEW > Permissions and grant access to the group hdfs_users.
- Now go to admin > manage ambari > User + Group Management > Groups > hdfs_users and add awesome.user to the local users of this group (a quick way to verify this from the command line is shown below).
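If you prefer to verify this from the command line, the same group information is visible through the Ambari REST API. The host, port, and credentials below are assumptions; replace them with your Ambari server details:

# hdfs_users should appear in the returned list of groups
curl -s -u admin:admin -H "X-Requested-By: ambari" http://localhost:8080/api/v1/groups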
Ambari impersonates the currently logged-in user through the hive superuser, so the simplest (but not the best) thing is to allow the hive superuser to impersonate all users. Similarly, we also need to tell Hadoop the hosts from which the hive superuser can connect (the simplest is to allow all hosts).
On the Ambari dashboard go to Services > HDFS > Config > Advanced > Custom Core Site and add the following new parameters:
hadoop.proxyuser.hive.groups = *
hadoop.proxyuser.hive.hosts = *
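If you would rather script this than click through the UI, older Ambari releases ship a configs.sh helper on the Ambari server host. Treat the path and argument order below as assumptions and check the script shipped with your Ambari version before relying on it:

# Set the two proxyuser properties in core-site via the Ambari config helper
cd /var/lib/ambari-server/resources/scripts
./configs.sh -u admin -p admin set localhost awesome_cluster core-site "hadoop.proxyuser.hive.groups" "*"
./configs.sh -u admin -p admin set localhost awesome_cluster core-site "hadoop.proxyuser.hive.hosts" "*"

Whichever method you use, restart the affected services afterwards so the new values take effect.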
Go to admin > manage ambari > Views > Hive > Create Instance.
We will need to change the following fields:
| Field name | Value |
|---|---|
| Instance Name | awesome_hive_view |
| Display Name | AWESOME_HIVE_VIEW |
| Description | My awesome view for awesome hive |
It's fine to leave the other fields at their defaults.
The admin user will have permission for this view by default; for other users (awesome.user), follow these steps.
- Go to admin > manage ambari > User + Group Management > Groups and create a group named hive_users.
- Now go to admin > manage ambari > Views > Hive > AWESOME_HIVE_VIEW > Permissions and grant access to the group hive_users.
- Now go to admin > manage ambari > User + Group Management > Groups > hive_users and add awesome.user to the local users of this group (again, this can be verified from the command line as shown below).
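As before, you can cross-check the group membership through the Ambari REST API (host, port, and credentials are assumptions; adjust them for your setup):

# awesome.user should be listed as a member of hive_users
curl -s -u admin:admin -H "X-Requested-By: ambari" http://localhost:8080/api/v1/groups/hive_users/members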
Logically... your Hive view should be usable now... but it is not. Your admin user will work, but somehow other users will not be able to use the Hive view.
- The simplest way to solve this problem is to grant the respective user read-only permission on the Ambari dashboard (admin > manage ambari > clusters > awesome_cluster > Permissions > Read-Only) and have them log in to the Ambari dashboard at least once. You can remove this permission later.
- The other way is to log in to the Hive shell as the respective user (see the sketch below).
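For the second option, the sketch below switches to the user on a cluster node and runs one throwaway query. The HiveServer2 host and port in the Beeline URL are assumptions; point it at your own HiveServer2 instance:

sudo su - awesome.user
# Classic Hive CLI
hive -e 'show databases;'
# Or Beeline against HiveServer2 (assumed to listen on localhost:10000)
beeline -u 'jdbc:hive2://localhost:10000/default' -n awesome.user -e 'show databases;'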