# apache spark custom installation using sbt and the current scala version
#---------------- custom install Apache Spark -------------------#
# instructions to build Apache Spark from source w/ the current Scala version, using sbt (vs. Maven)
# download the Apache Spark source from the appropriate mirror
# untar:
tar zxf spark-1.3.1.tgz
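# a minimal sketch of the download step, assuming the Apache archive URL
# (any mirror listed on the Spark downloads page works just as well):
curl -O https://archive.apache.org/dist/spark/spark-1.3.1/spark-1.3.1.tgz
# work from the top-level source directory for the rest of the build:
cd spark-1.3.1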
# start the sbt shell, passing the property that selects the Scala 2.11 build
# (JVM system properties must be set when sbt is launched, not inside the shell; Scala 2.11.6 is current at the time of writing)
sbt -Dscala-2.11=true
# from the sbt prompt, build the executable assembly jar w/ all dependencies:
assembly
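# alternative, non-interactive sketch of the same build, assuming the helper
# scripts shipped with the Spark 1.3.x source tree (dev/change-version-to-2.11.sh
# switches the build to Scala 2.11, build/sbt is the bundled sbt launcher):
dev/change-version-to-2.11.sh
build/sbt -Dscala-2.11=true assembly
# the assembly jar should land under assembly/target/scala-2.11/
ls assembly/target/scala-2.11/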
#---------------- build the documentation (html) -------------------#
# install these python dependencies:
sudo pip3 install -U pygments
sudo pip3 install -U sphinx
# install ruby if you haven't already (separate from your system ruby, which you should leave alone)
brew install ruby
# and these ruby gems:
sudo gem install jekyll
sudo gem install jekyll-redirect-from
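# optional sanity check that the doc toolchain is installed and on your PATH
# (a rough sketch; adjust for however your python/ruby are set up):
jekyll --version
python3 -c "import sphinx, pygments; print(sphinx.__version__, pygments.__version__)"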
# cd into docs/ from the top-level spark dir
cd docs
PRODUCTION=1 jekyll build
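# the generated site should end up in docs/_site; to preview it locally,
# jekyll can serve the site and rebuild on changes instead of a one-off build:
PRODUCTION=1 jekyll serve --watch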