Getting self-IP
No metadata, using 'hostname -i': 172.17.0.2
Created Spark command: su ubuntu -c '/usr/spark-2.3.0/bin/spark-submit --master spark://spark-master:7077 /usr/local/smash_planner/build_phase.py'
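A rough sketch of how a bootstrap wrapper might assemble and launch that command from Python; the actual startup script is not included in this gist, so the code below is purely illustrative.

    # Hypothetical launcher for the command logged above (not from the gist).
    import subprocess

    SPARK_SUBMIT = "/usr/spark-2.3.0/bin/spark-submit"
    MASTER_URL = "spark://spark-master:7077"
    JOB = "/usr/local/smash_planner/build_phase.py"

    spark_cmd = f"{SPARK_SUBMIT} --master {MASTER_URL} {JOB}"
    # Drop privileges to the ubuntu user, as the logged command does.
    subprocess.run(["su", "ubuntu", "-c", spark_cmd], check=True)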
Modifying configuration
Creating the /usr/spark-2.3.0/conf/spark-env.sh file...
Adding to spark-env.sh:
export SPARK_LOCAL_IP=172.17.0.2
export PYSPARK_PYTHON=python
export SPARK_MASTER_IP=172.17.0.2
export SPARK_MASTER_OPTS=-Dspark.driver.port=7001,-Dspark.fileserver.port=7002,-Dspark.broadcast.port=7003,-Dspark.replClassServer.port=7004,-Dspark.blockManager.port=7005,-Dspark.executor.port=7006,-Dspark.ui.port=4040,-Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory
export SPARK_WORKER_OPTS=-Dspark.driver.port=7001,-Dspark.fileserver.port=7002,-Dspark.broadcast.port=7003,-Dspark.replClassServer.port=7004,-Dspark.blockManager.port=7005,-Dspark.executor.port=7006,-Dspark.ui.port=4040,-Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory
export SPARK_JAVA_OPTS=-Dspark.driver.port=7001,-Dspark.fileserver.port=7002,-Dspark.broadcast.port=7003,-Dspark.replClassServer.port=7004,-Dspark.blockManager.port=7005,-Dspark.executor.port=7006,-Dspark.ui.port=4040,-Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory
export SPARK_MASTER_HOST=spark-master
Creating the /spark/conf/spark-defaults.conf file...
Adding to spark-defaults.conf:
spark.master spark://spark-master:7077
spark.hadoop.fs.s3n.impl org.apache.hadoop.fs.s3native.NativeS3FileSystem
spark.driver.host spark-master
/etc/hosts edited contents:
127.0.0.1 localhost
::1 localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
172.17.0.2 0415cc42662b
172.17.0.2 spark-master
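The spark-env.sh and spark-defaults.conf entries generated above map onto ordinary Spark configuration keys. As a rough PySpark 2.3 sketch (not taken from the gist), the same settings could be expressed programmatically like this:

    # Programmatic equivalent of the generated config above (illustrative only).
    from pyspark import SparkConf

    conf = (
        SparkConf()
        .setMaster("spark://spark-master:7077")    # spark.master in spark-defaults.conf
        .set("spark.driver.host", "spark-master")  # spark.driver.host in spark-defaults.conf
        .set("spark.driver.port", "7001")          # from the -D flags in SPARK_MASTER_OPTS
        .set("spark.blockManager.port", "7005")
        .set("spark.ui.port", "4040")
        .set("spark.hadoop.fs.s3n.impl",
             "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
    )

Several of the other -D keys written above (spark.fileserver.port, spark.broadcast.port, spark.replClassServer.port, spark.executor.port, and the HttpBroadcastFactory setting) are legacy Spark 1.x options that Spark 2.3.0 no longer reads.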
Configuration changes successful
Running the Spark command for build, using updated configs
Spark Command: /usr/jdk1.8.0_131/bin/java -cp /usr/spark-2.3.0/conf/:/usr/spark-2.3.0/jars/*:/usr/hadoop-2.8.3/etc/hadoop/:/usr/hadoop-2.8.3/etc/hadoop/*:/usr/hadoop-2.8.3/share/hadoop/common/lib/*:/usr/hadoop-2.8.3/share/hadoop/common/*:/usr/hadoop-2.8.3/share/hadoop/hdfs/*:/usr/hadoop-2.8.3/share/hadoop/hdfs/lib/*:/usr/hadoop-2.8.3/share/hadoop/yarn/lib/*:/usr/hadoop-2.8.3/share/hadoop/yarn/*:/usr/hadoop-2.8.3/share/hadoop/mapreduce/lib/*:/usr/hadoop-2.8.3/share/hadoop/mapreduce/*:/usr/hadoop-2.8.3/share/hadoop/tools/lib/* -Xmx1g org.apache.spark.deploy.SparkSubmit --master spark://spark-master:7077 /usr/local/smash_planner/build_phase.py
========================================
2018-06-09 22:27:18 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Running build_phase...
Building data with pred_date of: 2018-06-09 22:27:18.978301
Creating and saving keyword data to S3...
About to run get_dataframe
Getting Spark Context...
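build_phase.py itself is not included in the gist; judging from the log output that follows (the submitted application name and the S3 write), its Spark setup presumably looks roughly like the sketch below, which is an assumption rather than the actual code.

    # Assumed shape of build_phase.py's context setup (not the real script).
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("SmashPlanner")   # matches "Submitted application: SmashPlanner" below
        .getOrCreate()             # master and driver host come from spark-defaults.conf
    )
    sc = spark.sparkContext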
2018-06-09 22:27:19 INFO SparkContext:54 - Running Spark version 2.3.0
2018-06-09 22:27:19 INFO SparkContext:54 - Submitted application: SmashPlanner
2018-06-09 22:27:19 INFO SecurityManager:54 - Changing view acls to: ubuntu
2018-06-09 22:27:19 INFO SecurityManager:54 - Changing modify acls to: ubuntu
2018-06-09 22:27:19 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-09 22:27:19 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-09 22:27:19 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ubuntu); groups with view permissions: Set(); users with modify permissions: Set(ubuntu); groups with modify permissions: Set()
2018-06-09 22:27:19 INFO Utils:54 - Successfully started service 'sparkDriver' on port 39465.
2018-06-09 22:27:19 INFO SparkEnv:54 - Registering MapOutputTracker
2018-06-09 22:27:19 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-06-09 22:27:19 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-06-09 22:27:19 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-06-09 22:27:19 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-89689f93-eb72-46e4-b365-424159fe4f85
2018-06-09 22:27:19 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-06-09 22:27:19 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-06-09 22:27:19 INFO log:192 - Logging initialized @3665ms
2018-06-09 22:27:19 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-06-09 22:27:20 INFO Server:414 - Started @3787ms
2018-06-09 22:27:20 INFO AbstractConnector:278 - Started ServerConnector@76aa1bd1{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-09 22:27:20 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@421fe915{/jobs,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4850a21e{/jobs/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54e5850c{/jobs/job,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@75a96751{/jobs/job/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38a46ba8{/stages,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@40915455{/stages/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2ab7819c{/stages/stage,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4501e632{/stages/stage/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@53ac346f{/stages/pool,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@10b6af26{/stages/pool/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5ec39e4c{/storage,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@14b9c3d0{/storage/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@26849264{/storage/rdd,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@36ed60d{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@419d09d2{/environment,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1b60e572{/environment/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19f543c8{/executors,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6cde3045{/executors/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@45a47795{/executors/threadDump,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@59f35126{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6a2ba2d3{/static,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@8db5e7{/,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4452082d{/api,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@50eb1fb{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7b63643f{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-06-09 22:27:20 INFO SparkUI:54 - Bound SparkUI to 172.17.0.2, and started at http://spark-master:4040
2018-06-09 22:27:20 INFO SparkContext:54 - Added file file:/usr/local/smash_planner/build_phase.py at spark://spark-master:39465/files/build_phase.py with timestamp 1528583240700
2018-06-09 22:27:20 INFO Utils:54 - Copying /usr/local/smash_planner/build_phase.py to /tmp/spark-02516e1f-711d-4644-a14e-68df1eccb8c6/userFiles-1e640ae3-5989-4779-973c-b5c5682b768d/build_phase.py
2018-06-09 22:27:20 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://spark-master:7077...
2018-06-09 22:27:20 WARN StandaloneAppClient$ClientEndpoint:87 - Failed to connect to master spark-master:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to spark-master/172.17.0.2:7077
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
    ... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: spark-master/172.17.0.2:7077
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    ... 11 more
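The chain of causes bottoms out in a plain Connection refused on the container's own address (spark-master/172.17.0.2:7077), which generally means nothing was listening on port 7077 when the driver tried to register, for instance because the standalone master process was never started (or had not finished starting) in this container. A minimal reachability probe, as an illustrative sketch rather than anything from the gist:

    # Quick check that something is accepting TCP connections on the master port.
    import socket

    def master_reachable(host="spark-master", port=7077, timeout=5):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(master_reachable())  # prints False in the situation logged above

If the probe fails inside the container, the usual first steps are to start the master (sbin/start-master.sh) before running spark-submit, or to point --master at the host that is actually running it.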