Created June 14, 2018 06:26
spark-submit config fail
$ python bin/smash_planner.py
Getting self-IP
No metadata, found 'hostname -i': 172.17.0.2
Using ip: 172.17.0.2
Modifying configuration
Creating the /usr/spark-2.3.0/conf/spark-env.sh file...
Adding to spark-env.sh:
export SPARK_LOCAL_IP=172.17.0.2
export PYSPARK_PYTHON=python
Creating the /spark/conf/spark-defaults.conf file...
Adding to spark-defaults.conf:
spark.master spark://spark-master:7077
spark.driver.host spark-master
/etc/hosts edited contents:
127.0.0.1 localhost
::1 localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
172.17.0.2 cdee95838ee0
172.17.0.2 spark-master
Configuration changes successful
Running the Spark command for build, using updated configs
Created Spark command: su ubuntu -c '/usr/spark-2.3.0/bin/spark-submit --master spark://spark-master:7077 --name spark-master /usr/local/smash_planner/build_phase.py'
Spark Command: /usr/jdk1.8.0_131/bin/java -cp /usr/spark-2.3.0/conf/:/usr/spark-2.3.0/jars/*:/usr/hadoop-2.8.3/etc/hadoop/:/usr/hadoop-2.8.3/etc/hadoop/*:/usr/hadoop-2.8.3/share/hadoop/common/lib/*:/usr/hadoop-2.8.3/share/hadoop/common/*:/usr/hadoop-2.8.3/share/hadoop/hdfs/*:/usr/hadoop-2.8.3/share/hadoop/hdfs/lib/*:/usr/hadoop-2.8.3/share/hadoop/yarn/lib/*:/usr/hadoop-2.8.3/share/hadoop/yarn/*:/usr/hadoop-2.8.3/share/hadoop/mapreduce/lib/*:/usr/hadoop-2.8.3/share/hadoop/mapreduce/*:/usr/hadoop-2.8.3/share/hadoop/tools/lib/* -Xmx1g org.apache.spark.deploy.SparkSubmit --master spark://spark-master:7077 --name spark-master /usr/local/smash_planner/build_phase.py
========================================
2018-06-14 06:22:43 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Running build_phase...
Building data with pred_date of: 2018-06-14 06:22:44.403131
Creating and saving keyword data to S3...
About to run get_dataframe
Getting Spark Context...
2018-06-14 06:22:44 INFO SparkContext:54 - Running Spark version 2.3.0
2018-06-14 06:22:44 INFO SparkContext:54 - Submitted application: SmashPlanner
2018-06-14 06:22:44 INFO SecurityManager:54 - Changing view acls to: ubuntu
2018-06-14 06:22:44 INFO SecurityManager:54 - Changing modify acls to: ubuntu
2018-06-14 06:22:44 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-14 06:22:44 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-14 06:22:44 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ubuntu); groups with view permissions: Set(); users with modify permissions: Set(ubuntu); groups with modify permissions: Set()
2018-06-14 06:22:44 INFO Utils:54 - Successfully started service 'sparkDriver' on port 41895.
2018-06-14 06:22:44 INFO SparkEnv:54 - Registering MapOutputTracker
2018-06-14 06:22:45 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-06-14 06:22:45 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-06-14 06:22:45 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-06-14 06:22:45 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-111bcb92-0433-4efb-a5ae-6ab7bc2c8d5d
2018-06-14 06:22:45 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-06-14 06:22:45 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-06-14 06:22:45 INFO log:192 - Logging initialized @3791ms
2018-06-14 06:22:45 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-06-14 06:22:45 INFO Server:414 - Started @3930ms
2018-06-14 06:22:45 INFO AbstractConnector:278 - Started ServerConnector@76431fcc{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-14 06:22:45 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@668e8e8d{/jobs,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3ad47d23{/jobs/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@528f39f5{/jobs/job,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@50314c50{/jobs/job/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@49150e10{/stages,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@627ec656{/stages/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@578be9ac{/stages/stage,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@78deb15f{/stages/stage/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7090e161{/stages/pool,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6eea6fd6{/stages/pool/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3603ad0a{/storage,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3b5ab6dc{/storage/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4c36c33f{/storage/rdd,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@711f91b0{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a5f0964{/environment,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@345db8a5{/environment/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4e08ba2a{/executors,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4317204a{/executors/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@61a11565{/executors/threadDump,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38600164{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@333225a8{/static,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@651e47ac{/,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@cd128b1{/api,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@423dc58b{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@72cc60ba{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-06-14 06:22:45 INFO SparkUI:54 - Bound SparkUI to 172.17.0.2, and started at http://spark-master:4040
2018-06-14 06:22:46 INFO SparkContext:54 - Added file file:/usr/local/smash_planner/build_phase.py at spark://spark-master:41895/files/build_phase.py with timestamp 1528957366122
2018-06-14 06:22:46 INFO Utils:54 - Copying /usr/local/smash_planner/build_phase.py to /tmp/spark-91133b2d-545a-4f96-863b-83b6f8f13cbd/userFiles-55b13e7c-9766-4f52-9f29-2072b7d9a481/build_phase.py
2018-06-14 06:22:46 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-14 06:22:46 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
2018-06-14 06:23:06 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-14 06:23:06 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
2018-06-14 06:23:26 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-14 06:23:26 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
2018-06-14 06:23:46 ERROR StandaloneSchedulerBackend:70 - Application has been killed. Reason: All masters are unresponsive! Giving up.
2018-06-14 06:23:46 WARN StandaloneSchedulerBackend:66 - Application ID is not initialized yet.
2018-06-14 06:23:46 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46113.
2018-06-14 06:23:46 INFO AbstractConnector:318 - Stopped Spark@76431fcc{HTTP/1.1,[http/1.1]}{172.17.0.2:4040}
2018-06-14 06:23:46 INFO NettyBlockTransferService:54 - Server created on spark-master:46113
2018-06-14 06:23:46 INFO SparkUI:54 - Stopped Spark web UI at http://spark-master:4040
2018-06-14 06:23:46 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-06-14 06:23:46 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, spark-master, 46113, None)
2018-06-14 06:23:46 INFO StandaloneSchedulerBackend:54 - Shutting down all executors
2018-06-14 06:23:46 INFO BlockManagerMasterEndpoint:54 - Registering block manager spark-master:46113 with 366.3 MB RAM, BlockManagerId(driver, spark-master, 46113, None)
2018-06-14 06:23:46 INFO CoarseGrainedSchedulerBackend$DriverEndpoint:54 - Asking each executor to shut down
2018-06-14 06:23:46 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, spark-master, 46113, None)
2018-06-14 06:23:46 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, spark-master, 46113, None)
2018-06-14 06:23:46 WARN StandaloneAppClient$ClientEndpoint:66 - Drop UnregisterApplication(null) because has not yet connected to master
2018-06-14 06:23:46 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-06-14 06:23:46 INFO MemoryStore:54 - MemoryStore cleared
2018-06-14 06:23:46 INFO BlockManager:54 - BlockManager stopped
2018-06-14 06:23:46 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-06-14 06:23:46 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-06-14 06:23:46 INFO SparkContext:54 - Successfully stopped SparkContext
2018-06-14 06:23:46 ERROR SparkContext:91 - Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:515)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
2018-06-14 06:23:46 INFO SparkContext:54 - SparkContext already stopped.
Traceback (most recent call last):
  File "/usr/local/smash_planner/build_phase.py", line 13, in <module>
    main()
  File "/usr/local/smash_planner/build_phase.py", line 9, in main
    build_all_data(pred_date)
  File "/usr/local/smash_planner/DataPiping/build_data.py", line 25, in build_all_data
    save_keyword(pred_date)
  File "/usr/local/smash_planner/DataPiping/build_data.py", line 52, in save_keyword
    df = get_dataframe(query)
  File "/usr/local/smash_planner/SparkUtil/data_piping.py", line 15, in get_dataframe
    sc = SparkCtx.get_sparkCtx()
  File "/usr/local/smash_planner/SparkUtil/context.py", line 20, in get_sparkCtx
    sc = SparkContext(conf=conf).getOrCreate()
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 118, in __init__
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 180, in _do_init
  File "/usr/spark-2.3.0/python/lib/pyspark.zip/pyspark/context.py", line 270, in _initialize_context
  File "/usr/local/lib/python3.4/dist-packages/py4j-0.10.6-py3.4.egg/py4j/java_gateway.py", line 1428, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/usr/local/lib/python3.4/dist-packages/py4j-0.10.6-py3.4.egg/py4j/protocol.py", line 320, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:515)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
2018-06-14 06:23:46 INFO ShutdownHookManager:54 - Shutdown hook called
2018-06-14 06:23:46 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-74be0e23-01a2-4a1e-83b4-15d8515c4544
2018-06-14 06:23:46 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-91133b2d-545a-4f96-863b-83b6f8f13cbd
Traceback (most recent call last):
  File "bin/smash_planner.py", line 81, in <module>
    raise RuntimeError("Spark hated your config and/or invocation...")
RuntimeError: Spark hated your config and/or invocation...
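The chain of errors above points at one root cause: nothing was listening on spark-master:7077 when the driver tried to register ("Connection refused: spark-master/172.17.0.2:7077"), so the standalone master process was likely never started (or died) in this container. The later "Can only call getServletHandlers on a running MetricsSystem" error is just a downstream symptom of the SparkContext being torn down mid-startup. A cheap way to turn the noisy traceback into an immediate, readable failure is to probe the master port before calling spark-submit. This is a minimal stdlib sketch, not part of the gist's code; the `master_reachable` helper name is hypothetical, while the host and port come from the log above:

```python
import socket

def master_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unresolvable hostnames.
        return False

# Hypothetical pre-flight check for smash_planner.py, before spark-submit runs:
#
#     if not master_reachable("spark-master", 7077):
#         raise RuntimeError("No Spark master listening on spark-master:7077; "
#                            "start one (e.g. sbin/start-master.sh) first")
```

Two smaller things also stand out in the traceback: `SparkContext(conf=conf).getOrCreate()` in context.py constructs a fresh context before `getOrCreate` ever runs, so the idiomatic `SparkContext.getOrCreate(conf=conf)` form would avoid the redundant constructor; and since /etc/hosts maps spark-master to this container's own IP, the check above would confirm whether a master is actually running locally rather than on a separate host.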