spark docker packages
bash-4.1# spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.2,TargetHolding:pyspark-cassandra:0.3.5 --exclude-packages org.slf4j:slf4j-api
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.datastax.spark#spark-cassandra-connector_2.10 added as a dependency
TargetHolding#pyspark-cassandra added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found com.datastax.spark#spark-cassandra-connector_2.10;1.6.2 in central
	found joda-time#joda-time;2.3 in central
	found com.twitter#jsr166e;1.1.0 in central
	found io.netty#netty-all;4.0.33.Final in central
	found org.joda#joda-convert;1.2 in central
	found org.scala-lang#scala-reflect;2.10.5 in central
	found TargetHolding#pyspark-cassandra;0.3.5 in spark-packages
	found com.datastax.spark#spark-cassandra-connector-java_2.10;1.6.0-M1 in central
	found org.apache.cassandra#cassandra-clientutil;3.0.2 in central
	found com.datastax.cassandra#cassandra-driver-core;3.0.0 in central
	found io.netty#netty-handler;4.0.33.Final in central
	found io.netty#netty-buffer;4.0.33.Final in central
	found io.netty#netty-common;4.0.33.Final in central
	found io.netty#netty-transport;4.0.33.Final in central
	found io.netty#netty-codec;4.0.33.Final in central
	found io.dropwizard.metrics#metrics-core;3.1.2 in central
	found org.apache.commons#commons-lang3;3.3.2 in central
	found com.google.guava#guava;16.0.1 in central
downloading https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.10/1.6.2/spark-cassandra-connector_2.10-1.6.2.jar ...
	[SUCCESSFUL ] com.datastax.spark#spark-cassandra-connector_2.10;1.6.2!spark-cassandra-connector_2.10.jar (562ms)
downloading http://dl.bintray.com/spark-packages/maven/TargetHolding/pyspark-cassandra/0.3.5/pyspark-cassandra-0.3.5.jar ...
	[SUCCESSFUL ] TargetHolding#pyspark-cassandra;0.3.5!pyspark-cassandra.jar (457ms)
downloading https://repo1.maven.org/maven2/joda-time/joda-time/2.3/joda-time-2.3.jar ...
	[SUCCESSFUL ] joda-time#joda-time;2.3!joda-time.jar (102ms)
downloading https://repo1.maven.org/maven2/com/twitter/jsr166e/1.1.0/jsr166e-1.1.0.jar ...
	[SUCCESSFUL ] com.twitter#jsr166e;1.1.0!jsr166e.jar (79ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-all/4.0.33.Final/netty-all-4.0.33.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-all;4.0.33.Final!netty-all.jar (205ms)
downloading https://repo1.maven.org/maven2/org/joda/joda-convert/1.2/joda-convert-1.2.jar ...
	[SUCCESSFUL ] org.joda#joda-convert;1.2!joda-convert.jar (84ms)
downloading https://repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.10.5/scala-reflect-2.10.5.jar ...
	[SUCCESSFUL ] org.scala-lang#scala-reflect;2.10.5!scala-reflect.jar (279ms)
downloading https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector-java_2.10/1.6.0-M1/spark-cassandra-connector-java_2.10-1.6.0-M1.jar ...
	[SUCCESSFUL ] com.datastax.spark#spark-cassandra-connector-java_2.10;1.6.0-M1!spark-cassandra-connector-java_2.10.jar (90ms)
downloading https://repo1.maven.org/maven2/org/apache/cassandra/cassandra-clientutil/3.0.2/cassandra-clientutil-3.0.2.jar ...
	[SUCCESSFUL ] org.apache.cassandra#cassandra-clientutil;3.0.2!cassandra-clientutil.jar (77ms)
downloading https://repo1.maven.org/maven2/com/datastax/cassandra/cassandra-driver-core/3.0.0/cassandra-driver-core-3.0.0.jar ...
	[SUCCESSFUL ] com.datastax.cassandra#cassandra-driver-core;3.0.0!cassandra-driver-core.jar(bundle) (129ms)
downloading https://repo1.maven.org/maven2/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar ...
	[SUCCESSFUL ] org.apache.commons#commons-lang3;3.3.2!commons-lang3.jar (114ms)
downloading https://repo1.maven.org/maven2/com/google/guava/guava/16.0.1/guava-16.0.1.jar ...
	[SUCCESSFUL ] com.google.guava#guava;16.0.1!guava.jar(bundle) (230ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-handler/4.0.33.Final/netty-handler-4.0.33.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-handler;4.0.33.Final!netty-handler.jar (101ms)
downloading https://repo1.maven.org/maven2/io/dropwizard/metrics/metrics-core/3.1.2/metrics-core-3.1.2.jar ...
	[SUCCESSFUL ] io.dropwizard.metrics#metrics-core;3.1.2!metrics-core.jar(bundle) (97ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-buffer/4.0.33.Final/netty-buffer-4.0.33.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-buffer;4.0.33.Final!netty-buffer.jar (76ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-transport/4.0.33.Final/netty-transport-4.0.33.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-transport;4.0.33.Final!netty-transport.jar (107ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-codec/4.0.33.Final/netty-codec-4.0.33.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-codec;4.0.33.Final!netty-codec.jar (133ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-common/4.0.33.Final/netty-common-4.0.33.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-common;4.0.33.Final!netty-common.jar (92ms)
:: resolution report :: resolve 21912ms :: artifacts dl 3036ms
	:: modules in use:
	TargetHolding#pyspark-cassandra;0.3.5 from spark-packages in [default]
	com.datastax.cassandra#cassandra-driver-core;3.0.0 from central in [default]
	com.datastax.spark#spark-cassandra-connector-java_2.10;1.6.0-M1 from central in [default]
	com.datastax.spark#spark-cassandra-connector_2.10;1.6.2 from central in [default]
	com.google.guava#guava;16.0.1 from central in [default]
	com.twitter#jsr166e;1.1.0 from central in [default]
	io.dropwizard.metrics#metrics-core;3.1.2 from central in [default]
	io.netty#netty-all;4.0.33.Final from central in [default]
	io.netty#netty-buffer;4.0.33.Final from central in [default]
	io.netty#netty-codec;4.0.33.Final from central in [default]
	io.netty#netty-common;4.0.33.Final from central in [default]
	io.netty#netty-handler;4.0.33.Final from central in [default]
	io.netty#netty-transport;4.0.33.Final from central in [default]
	joda-time#joda-time;2.3 from central in [default]
	org.apache.cassandra#cassandra-clientutil;3.0.2 from central in [default]
	org.apache.commons#commons-lang3;3.3.2 from central in [default]
	org.joda#joda-convert;1.2 from central in [default]
	org.scala-lang#scala-reflect;2.10.5 from central in [default]
	:: evicted modules:
	com.datastax.spark#spark-cassandra-connector_2.10;1.6.0-M1 by [com.datastax.spark#spark-cassandra-connector_2.10;1.6.2] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   19  |   18  |   18  |   1   ||   18  |   18  |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
	18 artifacts copied, 0 already retrieved (17146kB/29ms)
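
For reference, a hypothetical sanity check, not part of the captured session: once Ivy has copied the jars into /root/.ivy2/jars, spark-submit ships them to the driver and executors, so the connector's classes should be importable from the Scala prompt that appears below.

    // Hypothetical REPL check: these imports resolve only if the
    // --packages jars actually made it onto the driver classpath.
    import com.datastax.spark.connector._                       // cassandraTable(), saveToCassandra()
    import com.datastax.spark.connector.cql.CassandraConnector  // direct CQL session access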
17/01/03 00:02:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/03 00:02:23 INFO spark.SecurityManager: Changing view acls to: root
17/01/03 00:02:23 INFO spark.SecurityManager: Changing modify acls to: root
17/01/03 00:02:23 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
17/01/03 00:02:23 INFO spark.HttpServer: Starting HTTP Server
17/01/03 00:02:23 INFO server.Server: jetty-8.y.z-SNAPSHOT
17/01/03 00:02:23 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:38184
17/01/03 00:02:23 INFO util.Utils: Successfully started service 'HTTP class server' on port 38184.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51)
Type in expressions to have them evaluated.
Type :help for more information.
17/01/03 00:02:26 INFO spark.SparkContext: Running Spark version 1.6.0
17/01/03 00:02:26 INFO spark.SecurityManager: Changing view acls to: root
17/01/03 00:02:26 INFO spark.SecurityManager: Changing modify acls to: root
17/01/03 00:02:26 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
17/01/03 00:02:26 INFO util.Utils: Successfully started service 'sparkDriver' on port 36755.
17/01/03 00:02:27 INFO slf4j.Slf4jLogger: Slf4jLogger started
17/01/03 00:02:27 INFO Remoting: Starting remoting
17/01/03 00:02:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.17.0.2:41761]
17/01/03 00:02:27 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 41761.
17/01/03 00:02:27 INFO spark.SparkEnv: Registering MapOutputTracker
17/01/03 00:02:27 INFO spark.SparkEnv: Registering BlockManagerMaster
17/01/03 00:02:27 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-1c12ad81-58ce-4784-b3bf-3675039decad
17/01/03 00:02:27 INFO storage.MemoryStore: MemoryStore started with capacity 511.5 MB
17/01/03 00:02:27 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/01/03 00:02:27 INFO server.Server: jetty-8.y.z-SNAPSHOT
17/01/03 00:02:27 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
17/01/03 00:02:27 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/01/03 00:02:27 INFO ui.SparkUI: Started SparkUI at http://172.17.0.2:4040
17/01/03 00:02:27 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-56996647-c055-44bc-b88b-74271a4c3804/httpd-5c321823-db01-44ca-88c2-40b5451f7b1d
17/01/03 00:02:27 INFO spark.HttpServer: Starting HTTP Server
17/01/03 00:02:27 INFO server.Server: jetty-8.y.z-SNAPSHOT
17/01/03 00:02:27 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:38077
17/01/03 00:02:27 INFO util.Utils: Successfully started service 'HTTP file server' on port 38077.
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/com.datastax.spark_spark-cassandra-connector_2.10-1.6.2.jar at http://172.17.0.2:38077/jars/com.datastax.spark_spark-cassandra-connector_2.10-1.6.2.jar with timestamp 1483419747911
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/TargetHolding_pyspark-cassandra-0.3.5.jar at http://172.17.0.2:38077/jars/TargetHolding_pyspark-cassandra-0.3.5.jar with timestamp 1483419747912
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/joda-time_joda-time-2.3.jar at http://172.17.0.2:38077/jars/joda-time_joda-time-2.3.jar with timestamp 1483419747914
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/com.twitter_jsr166e-1.1.0.jar at http://172.17.0.2:38077/jars/com.twitter_jsr166e-1.1.0.jar with timestamp 1483419747915
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.netty_netty-all-4.0.33.Final.jar at http://172.17.0.2:38077/jars/io.netty_netty-all-4.0.33.Final.jar with timestamp 1483419747919
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/org.joda_joda-convert-1.2.jar at http://172.17.0.2:38077/jars/org.joda_joda-convert-1.2.jar with timestamp 1483419747920
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-reflect-2.10.5.jar at http://172.17.0.2:38077/jars/org.scala-lang_scala-reflect-2.10.5.jar with timestamp 1483419747925
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/com.datastax.spark_spark-cassandra-connector-java_2.10-1.6.0-M1.jar at http://172.17.0.2:38077/jars/com.datastax.spark_spark-cassandra-connector-java_2.10-1.6.0-M1.jar with timestamp 1483419747926
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.cassandra_cassandra-clientutil-3.0.2.jar at http://172.17.0.2:38077/jars/org.apache.cassandra_cassandra-clientutil-3.0.2.jar with timestamp 1483419747927
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/com.datastax.cassandra_cassandra-driver-core-3.0.0.jar at http://172.17.0.2:38077/jars/com.datastax.cassandra_cassandra-driver-core-3.0.0.jar with timestamp 1483419747928
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.commons_commons-lang3-3.3.2.jar at http://172.17.0.2:38077/jars/org.apache.commons_commons-lang3-3.3.2.jar with timestamp 1483419747929
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/com.google.guava_guava-16.0.1.jar at http://172.17.0.2:38077/jars/com.google.guava_guava-16.0.1.jar with timestamp 1483419747932
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.netty_netty-handler-4.0.33.Final.jar at http://172.17.0.2:38077/jars/io.netty_netty-handler-4.0.33.Final.jar with timestamp 1483419747933
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.dropwizard.metrics_metrics-core-3.1.2.jar at http://172.17.0.2:38077/jars/io.dropwizard.metrics_metrics-core-3.1.2.jar with timestamp 1483419747933
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.netty_netty-buffer-4.0.33.Final.jar at http://172.17.0.2:38077/jars/io.netty_netty-buffer-4.0.33.Final.jar with timestamp 1483419747934
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.netty_netty-transport-4.0.33.Final.jar at http://172.17.0.2:38077/jars/io.netty_netty-transport-4.0.33.Final.jar with timestamp 1483419747935
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.netty_netty-codec-4.0.33.Final.jar at http://172.17.0.2:38077/jars/io.netty_netty-codec-4.0.33.Final.jar with timestamp 1483419747935
17/01/03 00:02:27 INFO spark.SparkContext: Added JAR file:/root/.ivy2/jars/io.netty_netty-common-4.0.33.Final.jar at http://172.17.0.2:38077/jars/io.netty_netty-common-4.0.33.Final.jar with timestamp 1483419747936
17/01/03 00:02:28 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/01/03 00:02:28 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
17/01/03 00:02:28 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
17/01/03 00:02:28 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/01/03 00:02:28 INFO yarn.Client: Setting up container launch context for our AM
17/01/03 00:02:28 INFO yarn.Client: Setting up the launch environment for our AM container
17/01/03 00:02:28 INFO yarn.Client: Preparing resources for our AM container
17/01/03 00:02:28 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs:/spark/spark-assembly-1.6.0-hadoop2.6.0.jar
17/01/03 00:02:28 INFO yarn.Client: Uploading resource file:/tmp/spark-56996647-c055-44bc-b88b-74271a4c3804/__spark_conf__708893425202883644.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1483419504930_0003/__spark_conf__708893425202883644.zip
17/01/03 00:02:29 INFO spark.SecurityManager: Changing view acls to: root
17/01/03 00:02:29 INFO spark.SecurityManager: Changing modify acls to: root
17/01/03 00:02:29 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
17/01/03 00:02:29 INFO yarn.Client: Submitting application 3 to ResourceManager
17/01/03 00:02:29 INFO impl.YarnClientImpl: Submitted application application_1483419504930_0003
17/01/03 00:02:30 INFO yarn.Client: Application report for application_1483419504930_0003 (state: ACCEPTED)
17/01/03 00:02:30 INFO yarn.Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1483419749645
	 final status: UNDEFINED
	 tracking URL: http://sandbox:8088/proxy/application_1483419504930_0003/
	 user: root
17/01/03 00:02:31 INFO yarn.Client: Application report for application_1483419504930_0003 (state: ACCEPTED)
17/01/03 00:02:32 INFO yarn.Client: Application report for application_1483419504930_0003 (state: ACCEPTED)
17/01/03 00:02:33 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
17/01/03 00:02:33 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sandbox, PROXY_URI_BASES -> http://sandbox:8088/proxy/application_1483419504930_0003), /proxy/application_1483419504930_0003
17/01/03 00:02:33 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/01/03 00:02:33 INFO yarn.Client: Application report for application_1483419504930_0003 (state: RUNNING)
17/01/03 00:02:33 INFO yarn.Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 172.17.0.2
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1483419749645
	 final status: UNDEFINED
	 tracking URL: http://sandbox:8088/proxy/application_1483419504930_0003/
	 user: root
17/01/03 00:02:33 INFO cluster.YarnClientSchedulerBackend: Application application_1483419504930_0003 has started running.
17/01/03 00:02:33 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37175.
17/01/03 00:02:33 INFO netty.NettyBlockTransferService: Server created on 37175
17/01/03 00:02:33 INFO storage.BlockManagerMaster: Trying to register BlockManager
17/01/03 00:02:33 INFO storage.BlockManagerMasterEndpoint: Registering block manager 172.17.0.2:37175 with 511.5 MB RAM, BlockManagerId(driver, 172.17.0.2, 37175)
17/01/03 00:02:33 INFO storage.BlockManagerMaster: Registered BlockManager
17/01/03 00:02:36 INFO cluster.YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (sandbox:48414) with ID 1
17/01/03 00:02:36 INFO storage.BlockManagerMasterEndpoint: Registering block manager sandbox:45935 with 511.5 MB RAM, BlockManagerId(1, sandbox, 45935)
17/01/03 00:02:37 INFO cluster.YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (sandbox:48418) with ID 2
17/01/03 00:02:37 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/01/03 00:02:37 INFO storage.BlockManagerMasterEndpoint: Registering block manager sandbox:37143 with 511.5 MB RAM, BlockManagerId(2, sandbox, 37143)
17/01/03 00:02:37 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
17/01/03 00:02:39 INFO hive.HiveContext: Initializing execution hive, version 1.2.1
17/01/03 00:02:39 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0
17/01/03 00:02:39 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
17/01/03 00:02:39 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
17/01/03 00:02:39 INFO metastore.ObjectStore: ObjectStore, initialize called
17/01/03 00:02:39 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/spark/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
17/01/03 00:02:39 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/spark/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
17/01/03 00:02:39 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/spark/lib/datanucleus-core-3.2.10.jar."
17/01/03 00:02:39 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
17/01/03 00:02:39 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
17/01/03 00:02:46 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/01/03 00:02:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:02:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:02:51 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:02:51 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:02:53 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/01/03 00:02:53 INFO metastore.ObjectStore: Initialized ObjectStore
17/01/03 00:02:53 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/01/03 00:02:53 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
17/01/03 00:02:53 INFO metastore.HiveMetaStore: Added admin role in metastore
17/01/03 00:02:53 INFO metastore.HiveMetaStore: Added public role in metastore
17/01/03 00:02:54 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
17/01/03 00:02:54 INFO metastore.HiveMetaStore: 0: get_all_databases
17/01/03 00:02:54 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_all_databases
17/01/03 00:02:54 INFO metastore.HiveMetaStore: 0: get_functions: db=default pat=*
17/01/03 00:02:54 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_functions: db=default pat=*
17/01/03 00:02:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:02:55 INFO session.SessionState: Created local directory: /tmp/7a957a13-7181-4077-a380-bcaeae515de9_resources
17/01/03 00:02:55 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/7a957a13-7181-4077-a380-bcaeae515de9
17/01/03 00:02:55 INFO session.SessionState: Created local directory: /tmp/root/7a957a13-7181-4077-a380-bcaeae515de9
17/01/03 00:02:55 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/7a957a13-7181-4077-a380-bcaeae515de9/_tmp_space.db
17/01/03 00:02:55 INFO hive.HiveContext: default warehouse location is /user/hive/warehouse
17/01/03 00:02:55 INFO hive.HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/01/03 00:02:55 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0
17/01/03 00:02:55 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
17/01/03 00:02:55 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
17/01/03 00:02:56 INFO metastore.ObjectStore: ObjectStore, initialize called
17/01/03 00:02:56 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/spark/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
17/01/03 00:02:56 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/spark/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
17/01/03 00:02:56 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/spark/lib/datanucleus-core-3.2.10.jar."
17/01/03 00:02:56 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
17/01/03 00:02:56 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
17/01/03 00:02:59 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/01/03 00:03:00 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:03:00 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:03:01 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:03:01 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:03:02 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
17/01/03 00:03:02 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/01/03 00:03:02 INFO metastore.ObjectStore: Initialized ObjectStore
17/01/03 00:03:02 INFO metastore.HiveMetaStore: Added admin role in metastore
17/01/03 00:03:02 INFO metastore.HiveMetaStore: Added public role in metastore
17/01/03 00:03:02 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
17/01/03 00:03:02 INFO metastore.HiveMetaStore: 0: get_all_databases
17/01/03 00:03:02 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_all_databases
17/01/03 00:03:02 INFO metastore.HiveMetaStore: 0: get_functions: db=default pat=*
17/01/03 00:03:02 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_functions: db=default pat=*
17/01/03 00:03:02 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
17/01/03 00:03:02 INFO session.SessionState: Created local directory: /tmp/85ae933e-836d-45c4-a840-f241a5a665f9_resources
17/01/03 00:03:02 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/85ae933e-836d-45c4-a840-f241a5a665f9
17/01/03 00:03:02 INFO session.SessionState: Created local directory: /tmp/root/85ae933e-836d-45c4-a840-f241a5a665f9
17/01/03 00:03:02 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/85ae933e-836d-45c4-a840-f241a5a665f9/_tmp_space.db
17/01/03 00:03:02 INFO repl.SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.
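
With sc and sqlContext available, the connector can be exercised directly. A minimal usage sketch, not part of the captured session: it assumes a reachable Cassandra node (e.g. spark-shell launched with --conf spark.cassandra.connection.host=<host>) and uses placeholder keyspace/table names.

    import com.datastax.spark.connector._   // enriches sc and RDDs with Cassandra methods

    // "my_keyspace" and "my_table" are placeholders for real schema objects.
    val rows = sc.cassandraTable("my_keyspace", "my_table")
    println(rows.count())

    // Writing back is symmetric: tuple fields map onto the listed columns.
    sc.parallelize(Seq(("k1", 1), ("k2", 2)))
      .saveToCassandra("my_keyspace", "my_table", SomeColumns("key", "value"))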