Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 1.0.0
- Fix Version/s: None
- Component/s: None
- Environment: Windows 2008 R2, HortonWorks Hadoop cluster 2.4, Spark 1.0
Description
A Windows-based YARN cluster fails to execute a submitted Spark job.
This can be reproduced by submitting JavaSparkPi to the YARN cluster:
C:\hdp\spark-1.0.0-bin-hadoop2\bin>spark-submit --class org.apache.spark.examples.JavaSparkPi ./../lib/spark-examples-1.0.0-hadoop2.2.0.jar --master yarn-cluster --deploy-mode cluster
The origin of the problem is org.apache.spark.deploy.yarn.ExecutorRunnableUtil.scala: JavaOpts parameters that contain environment variables need to be quoted.
- '%' should be escaped, e.g. "kill %%p". Even then it makes no sense on Windows; ideally it should be passed through "spark.executor.extraJavaOptions".
- javaOpts += "-Djava.io.tmpdir=\"" + new Path(Environment.PWD.$(), YarnConfiguration.DEFAULT_CONTAINER_TEMP_DIR) + "\""
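The quoting issue described above can be sketched in a few lines. This is not Spark's actual code (the real fix lives in the Scala file named above); it is a minimal, hypothetical illustration of why the -Djava.io.tmpdir option must be wrapped in double quotes when the container working directory may contain spaces or an unexpanded environment-variable reference:

```java
// Hypothetical sketch of the quoting fix: without the surrounding quotes,
// a tmpdir path containing a space would be split by the shell into two
// separate arguments, and the JVM would receive a truncated -D option.
public class JavaOptsQuoting {
    // Assumed helper mirroring the quoted concatenation shown in the
    // bullet above (ExecutorRunnableUtil.scala builds the same shape).
    static String tmpDirOpt(String containerPwd) {
        return "-Djava.io.tmpdir=\"" + containerPwd + "\\tmp\"";
    }

    public static void main(String[] args) {
        // A Windows working directory with a space in it.
        System.out.println(tmpDirOpt("C:\\spark work"));
        // prints: -Djava.io.tmpdir="C:\spark work\tmp"
    }
}
```

With the quotes, the whole option reaches the executor JVM as a single argument even when the expanded path contains spaces.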
Issue Links
- duplicates: SPARK-5754 Spark AM not launching on Windows (Resolved)