1. Under Ambari -> Knox -> Advanced topologies
2. Add the following snippet to the advanced topology:
<provider>
    <role>ha</role>
    <name>HaProvider</name>
    <enabled>true</enabled>
    <param>
        <name>HIVE</name>
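The snippet above cuts off inside the param; for reference, a complete HaProvider entry typically closes the param with a value listing failover settings and the ZooKeeper ensemble Knox uses to discover HiveServer2 instances. The hosts, namespace, and retry counts below are placeholders, not values from this cluster:

```xml
<provider>
    <role>ha</role>
    <name>HaProvider</name>
    <enabled>true</enabled>
    <param>
        <name>HIVE</name>
        <!-- Placeholder values; point zookeeperEnsemble at your cluster's ZooKeeper quorum -->
        <value>enabled=true;maxFailoverAttempts=3;failoverSleep=1000;zookeeperEnsemble=zk1:2181,zk2:2181,zk3:2181;zookeeperNamespace=hiveserver2</value>
    </param>
</provider>
```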
$ wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.bz2
$ tar xvf protobuf-2.5.0.tar.bz2
$ cd protobuf-2.5.0
$ ./configure CC=clang CXX=clang++ CXXFLAGS='-std=c++11 -stdlib=libc++ -O3 -g' LDFLAGS='-stdlib=libc++' LIBS="-lc++ -lc++abi"
$ make -j 4
$ sudo make install
$ protoc --version
mkdir spark-streaming-example
cd spark-streaming-example/
mkdir -p src/main/scala
cd src/main/scala
vim TestStreaming.scala
Add the following lines of code to TestStreaming.scala:
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.StreamingContext._
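The snippet above ends after the imports. A minimal complete TestStreaming.scala, assuming a socket text source on localhost:9999 and a word count as the example workload (both are my assumptions, not from the original), might look like:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._

object TestStreaming {
  def main(args: Array[String]): Unit = {
    // local[2]: one thread for the receiver, one for processing
    val conf = new SparkConf().setAppName("TestStreaming").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Hypothetical source: text lines arriving on localhost:9999 (e.g. `nc -lk 9999`)
    val lines = ssc.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```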
JPam is a Java-PAM bridge. PAM, or Pluggable Authentication Modules, is a standard security architecture used on Linux, Mac OS X, Solaris, HP-UX, and other Unix systems; JPam is the missing link between the two.
JPam permits the use of PAM authentication facilities by Java applications running on those platforms.
These facilities include:
account
auth
password
session
[root@rksnode ~]# curl http://rksnode:21000/api/atlas/admin/version
{"Version":"0.5.0.2.4.2.0-258-r375e15d6e3442c484b3a095a80127e41abef40b5","Name":"apache-atlas","Description":"Metadata Management and Data Governance Platform over Hadoop"}[root@rksnode ~]#
[root@rksnode ~]#
[root@rksnode ~]# curl http://rksnode:21000/api/atlas/types
{"results":["DataSet","hive_order","Process","hive_table","hive_db","hive_process","hive_principal_type","hive_resource_type","hive_object_type","Infrastructure","hive_index","hive_column","hive_resourceuri","hive_storagedesc","hive_role","hive_partition","hive_serde","hive_type"],"count":18,"requestId":"qtp1286783232-60 - 0128be6a-076e-4ad3-972a-58783a1f7180"}[root@rksnode ~]#
[root@rksnode ~]#
[root@rksnode ~]# curl http://rksnode:21000/api/atlas/types/hive_process
{"typeName":"hive_process","definition":"{\n \"enumTypes\":[\n \n ],\n \"structTypes\":[\n \n ],\n \"traitTypes\":[\n \n ],\n \"classTypes\":[\n {\n \"superTypes\":[\n
hive> set hive.support.concurrency;
hive.support.concurrency=true
hive> set hive.enforce.bucketing;
hive.enforce.bucketing=true
hive> set hive.exec.dynamic.partition.mode;
hive.exec.dynamic.partition.mode=nonstrict
hive> set hive.txn.manager;
hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
hive> set hive.compactor.initiator.on;
hive.compactor.initiator.on=true
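The same transaction-related properties can be pinned in hive-site.xml (for example via Ambari -> Hive -> Configs) rather than checked per session; a fragment with values matching the session output above:

```xml
<property>
    <name>hive.support.concurrency</name>
    <value>true</value>
</property>
<property>
    <name>hive.enforce.bucketing</name>
    <value>true</value>
</property>
<property>
    <name>hive.exec.dynamic.partition.mode</name>
    <value>nonstrict</value>
</property>
<property>
    <name>hive.txn.manager</name>
    <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
</property>
<property>
    <name>hive.compactor.initiator.on</name>
    <value>true</value>
</property>
```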
mkdir Spark2StarterApp
cd Spark2StarterApp/
mkdir -p src/main/scala
cd src/main/scala
vim Spark2Example.scala
Add the following code to Spark2Example.scala:
import org.apache.spark.sql.SparkSession
object Spark2Example {
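The object above is truncated after its opening line. A minimal complete Spark2Example.scala, with a tiny in-memory dataset as a stand-in workload (my choice, not from the original), might look like:

```scala
import org.apache.spark.sql.SparkSession

object Spark2Example {
  def main(args: Array[String]): Unit = {
    // SparkSession is the single entry point in Spark 2.x
    // (it subsumes the old SQLContext/HiveContext)
    val spark = SparkSession.builder()
      .appName("Spark2Example")
      .getOrCreate()

    import spark.implicits._

    // Placeholder workload: build a small Dataset and count it
    val ds = Seq("a", "b", "c").toDS()
    println(s"count = ${ds.count()}")

    spark.stop()
  }
}
```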
[root@rkk1 Spark2StarterApp]# /usr/hdp/current/spark2-client/bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/30 18:01:48 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.26.81.127:4040
Spark context available as 'sc' (master = local[*], app id = local-1480528906336).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
End-to-end latency
0.0543 ms (median)
0.003125 ms (99th percentile)
5 ms (99.9th percentile)
Producer and consumer throughput
Producer - 1431170.2 records/sec (136.49 MB/sec)
Consumer - 3276754.7021 records/sec (312.4957 MB/sec)
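Figures of this shape come from Kafka's bundled benchmark tools; runs along the following lines would produce comparable numbers. The topic name, broker address, record counts, and sizes are placeholders, not the settings behind the results above:

```shell
# Producer throughput (tool ships with Kafka under bin/)
bin/kafka-producer-perf-test.sh --topic perf-test --num-records 50000000 \
  --record-size 100 --throughput -1 \
  --producer-props bootstrap.servers=localhost:9092 acks=1

# Consumer throughput
bin/kafka-consumer-perf-test.sh --broker-list localhost:9092 \
  --topic perf-test --messages 50000000

# End-to-end latency: broker list, topic, message count, acks, message size
bin/kafka-run-class.sh kafka.tools.EndToEndLatency \
  localhost:9092 perf-test 10000 1 100
```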
import com.google.common.io.Resources;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.log4j.Logger;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
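The imports above suggest a producer that loads its configuration from a classpath resource via Guava's Resources helper. A minimal sketch along those lines; the resource name kafka-producer.properties, the class name, and the topic are placeholders, not from the original:

```java
import com.google.common.io.Resources;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.log4j.Logger;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class SimpleProducer {
    private static final Logger LOG = Logger.getLogger(SimpleProducer.class);

    public static void main(String[] args) throws IOException {
        // Placeholder resource: bootstrap.servers, key/value serializers, etc.
        Properties props = new Properties();
        try (InputStream in = Resources.getResource("kafka-producer.properties").openStream()) {
            props.load(in);
        }

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Placeholder topic; send() returns a Future<RecordMetadata>,
            // and get() blocks until the broker acknowledges the write
            ProducerRecord<String, String> record =
                new ProducerRecord<>("test-topic", "key-1", "hello kafka");
            RecordMetadata md = producer.send(record).get();
            LOG.info("Sent to partition " + md.partition() + " at offset " + md.offset());
        } catch (Exception e) {
            LOG.error("Send failed", e);
        }
    }
}
```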