Last active October 30, 2020 02:43
Submit apps (SparkPi as an example) to a Spark standalone cluster using the REST API
curl -X POST http://master-host:6066/v1/submissions/create \
  --header "Content-Type:application/json" \
  --data '{
  "action": "CreateSubmissionRequest",
  "appResource": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
  "clientSparkVersion": "2.0.0",
  "appArgs": [ "10" ],
  "environmentVariables": {
    "SPARK_ENV_LOADED": "1"
  },
  "mainClass": "org.apache.spark.examples.SparkPi",
  "sparkProperties": {
    "spark.jars": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
    "spark.driver.supervise": "false",
    "spark.executor.memory": "512m",
    "spark.driver.memory": "512m",
    "spark.submit.deployMode": "cluster",
    "spark.app.name": "SparkPi",
    "spark.master": "spark://master-host:6066"
  }
}'
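The same submission can be made from Python with only the standard library. This is a sketch, not part of the original gist: `master-host` and the HDFS jar path are placeholders copied from the curl example, and `submit()` needs a live standalone master with the REST server enabled.

```python
import json
from urllib import request

# Placeholder endpoint from the curl example above; replace with your master.
SUBMIT_URL = "http://master-host:6066/v1/submissions/create"


def build_payload(jar_hdfs_path, app_args):
    """Build the CreateSubmissionRequest body shown in the curl example."""
    return {
        "action": "CreateSubmissionRequest",
        "appResource": jar_hdfs_path,
        "clientSparkVersion": "2.0.0",
        "appArgs": app_args,
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "mainClass": "org.apache.spark.examples.SparkPi",
        "sparkProperties": {
            "spark.jars": jar_hdfs_path,
            "spark.driver.supervise": "false",
            "spark.executor.memory": "512m",
            "spark.driver.memory": "512m",
            "spark.submit.deployMode": "cluster",
            "spark.app.name": "SparkPi",
            "spark.master": "spark://master-host:6066",
        },
    }


def submit(url, payload):
    """POST the JSON payload; returns the parsed response (requires a live master)."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


payload = build_payload(
    "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar", ["10"]
)
```

A successful response includes a `submissionId`, which can be used against the master's status endpoint.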
@Hammad-Raza
Hi, I also needed to pass an argument to my Spark job, and this is how I solved it. Inside my job, which is a Python file:
The output is:
sys.argv=> ['Path/to/my/python/file', 'arg1']
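The commenter's original code is not shown, but the printed output implies the job reads `sys.argv` directly: the `appArgs` from the submission request arrive as command-line arguments to the script. A minimal sketch (the path and `arg1` are copied from the output above):

```python
import sys


def read_app_args(argv):
    # argv[0] is the script path; everything after it is the appArgs list
    # that was passed in the CreateSubmissionRequest.
    return argv[1:]


# Simulate the argv the driver would see for appArgs = ["arg1"].
args = read_app_args(["Path/to/my/python/file", "arg1"])
```

In the real job you would call `read_app_args(sys.argv)`.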
You can also see all the Spark config with the following code:
The output is:
[('spark.app.name', 'myfile.py'), ('spark.driver.cores', '4'), ('spark.driver.extraClassPath', '/jars/mysql-connector-java-8.0.11.jar'), ('spark.driver.supervise', 'true'), ('spark.eventLog.dir', 'file:/tmp/spark-events'), ('spark.eventLog.enabled', 'true'), ('spark.executor.extraClassPath', '/jars/mysql-connector-java-8.0.11.jar'), ('spark.executorEnv.JAVA_HOME', '/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.x86_64'), ('spark.files', 'file:Path/to/my/python/file'), ('spark.master', 'spark://master-url:7077'), ('spark.submit.deployMode', 'client'), ('spark.ui.enabled', 'true')]
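The hidden snippet presumably calls something like `sc.getConf().getAll()`, which returns a list of `(key, value)` string tuples as printed above. Turning that list into a dict makes individual settings easy to look up; the pairs below are copied from the printed output rather than read from a live SparkContext:

```python
# Sample of the (key, value) tuples shown in the output above.
conf_pairs = [
    ("spark.master", "spark://master-url:7077"),
    ("spark.submit.deployMode", "client"),
    ("spark.driver.cores", "4"),
]

# In a real job: conf = dict(sc.getConf().getAll())
conf = dict(conf_pairs)
```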
Hope this helps.