spark example fail: spark-submit of kmeans.py dies because nothing is listening at spark://spark-master:7077
root@cdee95838ee0:/usr/spark-2.3.0# spark-submit examples/src/main/python/kmeans.py data/mllib/kmeans_data.txt 10 5
Spark Command: /usr/jdk1.8.0_131/bin/java -cp /usr/spark-2.3.0/conf/:/usr/spark-2.3.0/jars/*:/usr/hadoop-2.8.3/etc/hadoop/:/usr/hadoop-2.8.3/etc/hadoop/*:/usr/hadoop-2.8.3/share/hadoop/common/lib/*:/usr/hadoop-2.8.3/share/hadoop/common/*:/usr/hadoop-2.8.3/share/hadoop/hdfs/*:/usr/hadoop-2.8.3/share/hadoop/hdfs/lib/*:/usr/hadoop-2.8.3/share/hadoop/yarn/lib/*:/usr/hadoop-2.8.3/share/hadoop/yarn/*:/usr/hadoop-2.8.3/share/hadoop/mapreduce/lib/*:/usr/hadoop-2.8.3/share/hadoop/mapreduce/*:/usr/hadoop-2.8.3/share/hadoop/tools/lib/* -Xmx1g org.apache.spark.deploy.SparkSubmit examples/src/main/python/kmeans.py data/mllib/kmeans_data.txt 10 5
========================================
2018-06-14 06:35:29 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARN: This is a naive implementation of KMeans Clustering and is given
as an example! Please refer to examples/src/main/python/ml/kmeans_example.py for an
example on how to use ML's KMeans implementation.
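
The example prints that warning itself: this kmeans.py is a naive, teaching-only implementation. Its positional arguments are <file> <k> <convergeDist>, so the run above asks for 10 clusters with a convergence threshold of 5. If real clustering is the goal, the DataFrame-based example the warning points to can be submitted the same way (a sketch assuming the same /usr/spark-2.3.0 layout as the log; in Spark 2.3.0, ml/kmeans_example.py loads its own sample data, so it should need no arguments):

    cd /usr/spark-2.3.0
    # runs pyspark.ml's KMeans on the bundled sample data
    spark-submit examples/src/main/python/ml/kmeans_example.py
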
2018-06-14 06:35:30 INFO SparkContext:54 - Running Spark version 2.3.0
2018-06-14 06:35:30 INFO SparkContext:54 - Submitted application: PythonKMeans
2018-06-14 06:35:30 INFO SecurityManager:54 - Changing view acls to: root
2018-06-14 06:35:30 INFO SecurityManager:54 - Changing modify acls to: root
2018-06-14 06:35:30 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-14 06:35:30 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-14 06:35:30 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2018-06-14 06:35:30 INFO Utils:54 - Successfully started service 'sparkDriver' on port 40605.
2018-06-14 06:35:30 INFO SparkEnv:54 - Registering MapOutputTracker
2018-06-14 06:35:30 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-06-14 06:35:30 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-06-14 06:35:30 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-06-14 06:35:30 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-3ac74180-fc0a-4b04-8e86-6825b5f38833
2018-06-14 06:35:30 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-06-14 06:35:30 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-06-14 06:35:30 INFO log:192 - Logging initialized @2874ms
2018-06-14 06:35:30 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-06-14 06:35:30 INFO Server:414 - Started @3039ms
2018-06-14 06:35:31 INFO AbstractConnector:278 - Started ServerConnector@76aa1bd1{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-14 06:35:31 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@421fe915{/jobs,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4850a21e{/jobs/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54e5850c{/jobs/job,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@75a96751{/jobs/job/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38a46ba8{/stages,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@40915455{/stages/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2ab7819c{/stages/stage,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4501e632{/stages/stage/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@53ac346f{/stages/pool,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@10b6af26{/stages/pool/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5ec39e4c{/storage,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@14b9c3d0{/storage/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@26849264{/storage/rdd,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@36ed60d{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@419d09d2{/environment,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1b60e572{/environment/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19f543c8{/executors,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6cde3045{/executors/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@45a47795{/executors/threadDump,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@59f35126{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6a2ba2d3{/static,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@8db5e7{/,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4452082d{/api,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@50eb1fb{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7b63643f{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-06-14 06:35:31 INFO SparkUI:54 - Bound SparkUI to 172.17.0.2, and started at http://spark-master:4040
2018-06-14 06:35:31 INFO SparkContext:54 - Added file file:/usr/spark-2.3.0/examples/src/main/python/kmeans.py at spark://spark-master:40605/files/kmeans.py with timestamp 1528958131695
2018-06-14 06:35:31 INFO Utils:54 - Copying /usr/spark-2.3.0/examples/src/main/python/kmeans.py to /tmp/spark-d7741082-4374-40eb-8785-da96f19df160/userFiles-fa451e49-739b-4dcf-bc0e-6c1a40325970/kmeans.py
2018-06-14 06:35:31 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-14 06:35:31 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
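
The root cause is already visible in this first trace: every "Caused by" layer reduces to "Connection refused: spark-master/172.17.0.2:7077". The hostname resolves (to this container's own address, 172.17.0.2), but nothing is listening on the standalone master port, so the driver cannot register. The client retries twice more below, about 20 seconds apart, before giving up. A first check is whether a master process is running at all; a sketch, assuming the standalone scripts shipped under this Spark layout (nc is an assumption, it may not be installed in the image):

    # is anything listening on the master port?
    nc -z -v spark-master 7077
    # if not, start a standalone master (and a worker) in this container:
    /usr/spark-2.3.0/sbin/start-master.sh --host spark-master --port 7077
    /usr/spark-2.3.0/sbin/start-slave.sh spark://spark-master:7077
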
2018-06-14 06:35:51 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-14 06:35:51 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
2018-06-14 06:36:11 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-14 06:36:11 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
2018-06-14 06:36:31 ERROR StandaloneSchedulerBackend:70 - Application has been killed. Reason: All masters are unresponsive! Giving up.
2018-06-14 06:36:31 WARN StandaloneSchedulerBackend:66 - Application ID is not initialized yet.
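
Note that the spark-submit invocation at the top never passed --master, so the spark://spark-master:7077 URL must be coming from the container's configuration (spark-defaults.conf or an environment variable). Worth confirming before deciding where to fix it; a sketch, with the exact config files present being an assumption:

    # where does spark://spark-master:7077 come from?
    grep -r "spark.master" /usr/spark-2.3.0/conf/ 2>/dev/null
    env | grep -i -E 'spark|master'
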
2018-06-14 06:36:31 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39199.
2018-06-14 06:36:31 INFO NettyBlockTransferService:54 - Server created on spark-master:39199
2018-06-14 06:36:31 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-06-14 06:36:31 INFO AbstractConnector:318 - Stopped Spark@76aa1bd1{HTTP/1.1,[http/1.1]}{172.17.0.2:4040}
2018-06-14 06:36:31 INFO SparkUI:54 - Stopped Spark web UI at http://spark-master:4040
2018-06-14 06:36:31 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, spark-master, 39199, None)
2018-06-14 06:36:31 INFO BlockManagerMasterEndpoint:54 - Registering block manager spark-master:39199 with 366.3 MB RAM, BlockManagerId(driver, spark-master, 39199, None)
2018-06-14 06:36:31 INFO StandaloneSchedulerBackend:54 - Shutting down all executors
2018-06-14 06:36:31 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, spark-master, 39199, None)
2018-06-14 06:36:31 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, spark-master, 39199, None)
2018-06-14 06:36:31 INFO CoarseGrainedSchedulerBackend$DriverEndpoint:54 - Asking each executor to shut down
2018-06-14 06:36:31 WARN StandaloneAppClient$ClientEndpoint:66 - Drop UnregisterApplication(null) because has not yet connected to master
2018-06-14 06:36:31 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-06-14 06:36:31 INFO MemoryStore:54 - MemoryStore cleared
2018-06-14 06:36:31 INFO BlockManager:54 - BlockManager stopped
2018-06-14 06:36:31 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-06-14 06:36:31 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-06-14 06:36:31 INFO SparkContext:54 - Successfully stopped SparkContext
2018-06-14 06:36:32 ERROR SparkContext:91 - Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:515)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
2018-06-14 06:36:32 INFO SparkContext:54 - SparkContext already stopped.
Traceback (most recent call last):
  File "/usr/spark-2.3.0/examples/src/main/python/kmeans.py", line 61, in <module>
    .getOrCreate()
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/sql/session.py", line 173, in getOrCreate
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 331, in getOrCreate
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 118, in __init__
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 180, in _do_init
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 270, in _initialize_context
  File "/usr/local/lib/python3.4/dist-packages/py4j-0.10.6-py3.4.egg/py4j/java_gateway.py", line 1428, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/usr/local/lib/python3.4/dist-packages/py4j-0.10.6-py3.4.egg/py4j/protocol.py", line 320, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:515)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
2018-06-14 06:36:32 INFO ShutdownHookManager:54 - Shutdown hook called
2018-06-14 06:36:32 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-d7741082-4374-40eb-8785-da96f19df160
2018-06-14 06:36:32 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-0c8d4f20-326f-46bc-b69e-f586b13571cb
root@cdee95838ee0:/usr/spark-2.3.0#
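
The closing IllegalArgumentException ("Can only call getServletHandlers on a running MetricsSystem") and the Py4JJavaError in the Python traceback are downstream symptoms, not separate bugs: the scheduler killed the application once all masters were unresponsive, so SparkContext construction trips over its own already-stopped context. With a reachable master the submit should proceed; to take the cluster wiring out of the picture entirely, the same command can also be forced into local mode (a sketch reusing the exact arguments from the log):

    cd /usr/spark-2.3.0
    # run driver and executors in one local JVM, no standalone master needed
    spark-submit --master 'local[*]' examples/src/main/python/kmeans.py data/mllib/kmeans_data.txt 10 5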