@dapangmao
Created March 2, 2015 02:52
root@bigcat:~# ./spark_ec2.py -k sas -i sas.pem --zone=us-east-1a launch test-cluster
Setting up security groups...
Searching for existing cluster test-cluster...
Spark AMI: ami-5bb18832
Launching instances...
Launched 1 slaves in us-east-1a, regid = r-259db6c8
Launched master in us-east-1a, regid = r-3e9db6d3
Waiting for AWS to propagate instance metadata...
Waiting for cluster to enter 'ssh-ready' state..........
Warning: SSH connection error. (This could be temporary.)
Host: 52.1.241.17
SSH return code: 255
SSH output: ssh: connect to host 52.1.241.17 port 22: Connection refused
.
Warning: SSH connection error. (This could be temporary.)
Host: 52.1.241.17
SSH return code: 255
SSH output: ssh: connect to host 52.1.241.17 port 22: Connection refused
.
Cluster is now in 'ssh-ready' state. Waited 336 seconds.
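The "Waiting for cluster to enter 'ssh-ready' state" phase above, with its transient "Connection refused" warnings, is essentially a retry loop on TCP port 22 while the instances boot. A minimal sketch of that polling (hypothetical, not the script's actual code):

```python
import socket
import time

def wait_for_ssh(host, port=22, timeout=600, interval=5):
    """Retry a TCP connect to the SSH port until it succeeds or times out.

    While an EC2 instance is still booting, connects fail with
    'Connection refused' -- the same warnings seen in the log above.
    Returns the number of seconds waited once the port is reachable.
    """
    start = time.time()
    while time.time() - start < timeout:
        try:
            with socket.create_connection((host, port), timeout=3):
                return time.time() - start
        except OSError:
            time.sleep(interval)
    raise TimeoutError(f"{host}:{port} not reachable after {timeout}s")
```

The log's "Waited 336 seconds" is simply how long this loop ran before port 22 on the master started accepting connections.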
Generating cluster's SSH key on master...
Warning: Permanently added 'ec2-52-1-241-17.compute-1.amazonaws.com,52.1.241.17' (ECDSA) to the list of known hosts.
Connection to ec2-52-1-241-17.compute-1.amazonaws.com closed.
Warning: Permanently added 'ec2-52-1-241-17.compute-1.amazonaws.com,52.1.241.17' (ECDSA) to the list of known hosts.
Transferring cluster's SSH key to slaves...
ec2-54-88-14-92.compute-1.amazonaws.com
Warning: Permanently added 'ec2-54-88-14-92.compute-1.amazonaws.com,54.88.14.92' (ECDSA) to the list of known hosts.
Warning: Permanently added 'ec2-52-1-241-17.compute-1.amazonaws.com,52.1.241.17' (ECDSA) to the list of known hosts.
Cloning into 'spark-ec2'...
remote: Counting objects: 1768, done.
remote: Total 1768 (delta 0), reused 0 (delta 0), pack-reused 1768
Receiving objects: 100% (1768/1768), 291.26 KiB, done.
Resolving deltas: 100% (637/637), done.
Connection to ec2-52-1-241-17.compute-1.amazonaws.com closed.
Deploying files to master...
Warning: Permanently added 'ec2-52-1-241-17.compute-1.amazonaws.com,52.1.241.17' (ECDSA) to the list of known hosts.
sending incremental file list
sent 32 bytes received 11 bytes 86.00 bytes/sec
total size is 0 speedup is 0.00
Running setup on master...
Warning: Permanently added 'ec2-52-1-241-17.compute-1.amazonaws.com,52.1.241.17' (ECDSA) to the list of known hosts.
Connection to ec2-52-1-241-17.compute-1.amazonaws.com closed.
Warning: Permanently added 'ec2-52-1-241-17.compute-1.amazonaws.com,52.1.241.17' (ECDSA) to the list of known hosts.
spark-ec2/setup.sh: line 20: ec2-variables.sh: No such file or directory
Setting up Spark on ip-172-31-29-75.ec2.internal...
Setting executable permissions on scripts...
RSYNC'ing /root/spark-ec2 to other cluster nodes...
[timing] rsync /root/spark-ec2: 00h 00m 00s
Running setup-slave on all cluster nodes to mount filesystems, etc...
Traceback (most recent call last):
  File "/usr/bin/pssh", line 118, in <module>
    do_pssh(hosts, cmdline, opts)
  File "/usr/bin/pssh", line 93, in do_pssh
    if min(statuses) < 0:
ValueError: min() arg is an empty sequence
[timing] setup-slave: 00h 00m 00s
Initializing scala
Unpacking Scala
--2015-03-02 02:36:55-- http://s3.amazonaws.com/spark-related-packages/scala-2.9.3.tgz
Resolving s3.amazonaws.com (s3.amazonaws.com)... 54.231.0.224
Connecting to s3.amazonaws.com (s3.amazonaws.com)|54.231.0.224|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 24699008 (24M) [application/x-compressed]
Saving to: ‘scala-2.9.3.tgz’
100%[===========================================================================================>] 24,699,008 3.32MB/s in 7.5s
2015-03-02 02:37:03 (3.15 MB/s) - ‘scala-2.9.3.tgz’ saved [24699008/24699008]
[timing] scala init: 00h 00m 08s
Creating local config files...
ssh: Could not resolve hostname cat: No address associated with hostname
Traceback (most recent call last):
  File "./deploy_templates.py", line 28, in <module>
    slave_ram_kb = int(os.popen(slave_mem_command).read().strip())
ValueError: invalid literal for int() with base 10: ''
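The `deploy_templates.py` failure follows from the line just before it: the ssh command that should read a slave's memory size was mangled ("Could not resolve hostname cat"), so it printed nothing, and `int('')` raised `ValueError`. A guarded version of that read (hypothetical names; the fallback value is an assumption, not from the script):

```python
import os

def read_slave_ram_kb(slave_mem_command, default_kb=512 * 1024):
    """Guarded variant of deploy_templates.py's memory probe.

    The original does int(os.popen(cmd).read().strip()); when the ssh
    command fails, as in the log, the output is '' and int('') raises
    ValueError. Fall back to a default instead of crashing.
    """
    out = os.popen(slave_mem_command).read().strip()
    return int(out) if out.isdigit() else default_kb
```

With the fallback, a failed probe degrades to a conservative memory setting rather than aborting the whole config-deployment step.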
Deploying Spark config files...
chmod: cannot access `/root/spark/conf/spark-env.sh': No such file or directory
File or directory /root/spark/conf doesn't exist!
Setting up scala
RSYNC'ing /root/scala to slaves...
[timing] scala setup: 00h 00m 00s
Connection to ec2-52-1-241-17.compute-1.amazonaws.com closed.
Spark standalone cluster started at http://ec2-52-1-241-17.compute-1.amazonaws.com:8080
Ganglia started at http://ec2-52-1-241-17.compute-1.amazonaws.com:5080/ganglia
Done!