Created October 22, 2014 04:33
hduser@dpn-linux-desktop:/usr/local/hadoop/spark-1.1.0-bin-hadoop2.4$ ./sbin/start-thriftserver.sh
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
14/10/22 14:27:27 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
14/10/22 14:27:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/10/22 14:27:29 INFO thriftserver.HiveThriftServer2: Starting SparkContext
14/10/22 14:27:29 WARN util.Utils: Your hostname, dpn-linux-desktop resolves to a loopback address: 127.0.0.1; using 10.1.1.10 instead (on interface eth0)
14/10/22 14:27:29 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
14/10/22 14:27:29 INFO spark.SecurityManager: Changing view acls to: hduser,
14/10/22 14:27:29 INFO spark.SecurityManager: Changing modify acls to: hduser,
14/10/22 14:27:29 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hduser, ); users with modify permissions: Set(hduser, )
14/10/22 14:27:30 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/10/22 14:27:30 INFO Remoting: Starting remoting
14/10/22 14:27:30 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:53261]
14/10/22 14:27:30 INFO Remoting: Remoting now listens on addresses: [akka.tcp://[email protected]:53261]
14/10/22 14:27:30 INFO util.Utils: Successfully started service 'sparkDriver' on port 53261.
14/10/22 14:27:30 INFO spark.SparkEnv: Registering MapOutputTracker
14/10/22 14:27:30 INFO spark.SparkEnv: Registering BlockManagerMaster
14/10/22 14:27:30 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20141022142730-b262
14/10/22 14:27:30 INFO util.Utils: Successfully started service 'Connection manager for block manager' on port 45341.
14/10/22 14:27:30 INFO network.ConnectionManager: Bound socket to port 45341 with id = ConnectionManagerId(dpn-linux-desktop.local,45341)
14/10/22 14:27:30 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
14/10/22 14:27:30 INFO storage.BlockManagerMaster: Trying to register BlockManager
14/10/22 14:27:30 INFO storage.BlockManagerMasterActor: Registering block manager dpn-linux-desktop.local:45341 with 265.4 MB RAM
14/10/22 14:27:30 INFO storage.BlockManagerMaster: Registered BlockManager
14/10/22 14:27:30 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-80813413-cc2e-4547-aa11-6008c9fa4296
14/10/22 14:27:30 INFO spark.HttpServer: Starting HTTP Server
14/10/22 14:27:30 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/10/22 14:27:31 INFO server.AbstractConnector: Started [email protected]:58372
14/10/22 14:27:31 INFO util.Utils: Successfully started service 'HTTP file server' on port 58372.
14/10/22 14:27:36 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/10/22 14:27:36 INFO server.AbstractConnector: Started [email protected]:4040
14/10/22 14:27:36 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
14/10/22 14:27:36 INFO ui.SparkUI: Started SparkUI at http://dpn-linux-desktop.local:4040
14/10/22 14:27:36 INFO util.AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://[email protected]:53261/user/HeartbeatReceiver
14/10/22 14:27:37 INFO service.AbstractService: HiveServer2: Async execution pool size 50
14/10/22 14:27:37 INFO service.AbstractService: Service:OperationManager is inited.
14/10/22 14:27:37 INFO service.AbstractService: Service: SessionManager is inited.
14/10/22 14:27:37 INFO service.AbstractService: Service: CLIService is inited.
14/10/22 14:27:37 INFO service.AbstractService: Service:ThriftBinaryCLIService is inited.
14/10/22 14:27:37 INFO service.AbstractService: Service: HiveServer2 is inited.
14/10/22 14:27:37 INFO service.AbstractService: Service:OperationManager is started.
14/10/22 14:27:37 INFO service.AbstractService: Service:SessionManager is started.
14/10/22 14:27:37 INFO service.AbstractService: Service:CLIService is started.
14/10/22 14:27:37 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
14/10/22 14:27:37 INFO metastore.ObjectStore: ObjectStore, initialize called
14/10/22 14:27:37 ERROR service.CompositeService: Error starting services HiveServer2
org.apache.hive.service.ServiceException: Unable to connect to MetaStore!
    at org.apache.hive.service.cli.CLIService.start(CLIService.java:85)
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70)
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:73)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:71)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
NestedThrowables:
java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:104)
    at org.apache.hive.service.cli.CLIService.start(CLIService.java:82)
    ... 11 more
Caused by: java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
    at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
    ... 32 more
14/10/22 14:27:37 INFO service.AbstractService: Service:OperationManager is stopped.
14/10/22 14:27:37 INFO service.AbstractService: Service:SessionManager is stopped.
14/10/22 14:27:37 INFO service.AbstractService: Service:CLIService is stopped.
14/10/22 14:27:37 ERROR thriftserver.HiveThriftServer2: Error starting HiveThriftServer2
org.apache.hive.service.ServiceException: Failed to Start HiveServer2
    at org.apache.hive.service.CompositeService.start(CompositeService.java:80)
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:73)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:71)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hive.service.ServiceException: Unable to connect to MetaStore!
    at org.apache.hive.service.cli.CLIService.start(CLIService.java:85)
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70)
    ... 10 more
Caused by: javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
NestedThrowables:
java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:104)
    at org.apache.hive.service.cli.CLIService.start(CLIService.java:82)
    ... 11 more
Caused by: java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
    at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
    ... 32 more
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
14/10/22 14:27:37 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
14/10/22 14:27:37 INFO ui.SparkUI: Stopped Spark web UI at http://dpn-linux-desktop.local:4040
14/10/22 14:27:37 INFO scheduler.DAGScheduler: Stopping DAGScheduler
14/10/22 14:27:39 INFO spark.MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
14/10/22 14:27:39 INFO network.ConnectionManager: Selector thread was interrupted!
14/10/22 14:27:39 INFO network.ConnectionManager: ConnectionManager stopped
14/10/22 14:27:39 INFO storage.MemoryStore: MemoryStore cleared
14/10/22 14:27:39 INFO storage.BlockManager: BlockManager stopped
14/10/22 14:27:39 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
14/10/22 14:27:39 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
14/10/22 14:27:39 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
14/10/22 14:27:39 INFO spark.SparkContext: Successfully stopped SparkContext
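The root cause in the log above is the nested `java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory`: Hive's metastore `ObjectStore` uses DataNucleus as its JDO implementation, and those jars are not on the Thrift server's classpath. One common workaround is to locate the DataNucleus jars shipped with the Spark distribution and pass them explicitly. A minimal sketch, assuming the jars sit in the distribution's `lib/` directory (the exact directory and jar versions vary by build, so verify with `ls` first):

```shell
# Assumption: this prebuilt Spark tarball ships DataNucleus jars in lib/.
# Verify before relying on the path; versions may differ per build.
SPARK_HOME=/usr/local/hadoop/spark-1.1.0-bin-hadoop2.4
ls "$SPARK_HOME"/lib/datanucleus-*.jar

# Join the jar paths into a comma-separated list and hand them to the
# Thrift server so the metastore can load its JDO implementation.
DATANUCLEUS_JARS=$(echo "$SPARK_HOME"/lib/datanucleus-*.jar | tr ' ' ',')
"$SPARK_HOME"/sbin/start-thriftserver.sh --jars "$DATANUCLEUS_JARS"
```

If the jars are missing entirely, the Spark build was likely made without Hive support; rebuilding with the `-Phive` profile (or using a prebuilt distribution that includes it) bundles the DataNucleus dependencies.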