Details
Description
After installation, when I run .\bin\spark-shell --master yarn, YarnClient reports an error while running commands like the following:
%JAVA_HOME%/bin/java -server -cp %CLASSPATH%;C:\hdp\spark-1.1.1\lib\spark-assembly-1.1.1-hadoop2.4.0.jar -Xmx512m -Djava.io.tmpdir=%PWD%/tmp '-Dspark.tachyonStore.folderName=spark-919783cd-bdf7-4e6b-86bf-011244e4a49f' '-Dspark.yarn.secondary.jars=' '-Dspark.repl.class.uri=http://192.168.0.13:12972' '-Dspark.driver.host=HOME-HYPERVS' '-Dspark.driver.appUIHistoryAddress=' '-Dspark.app.name=Spark shell' '-Dspark.driver.appUIAddress=HOME-HYPERVS:4040' '-Dspark.jars=' '-Dspark.fileserver.uri=http://192.168.0.13:12992' '-Dspark.master=yarn-client' '-Dspark.driver.port=12988' org.apache.spark.deploy.yarn.ExecutorLauncher --class 'notused' --jar null --arg 'HOME-HYPERVS:12988' --executor-memory 1024 --executor-cores 1 --num-executors 2
The command fails because the arguments are wrapped in single quotes instead of double quotes, and cmd.exe on Windows does not strip single quotes. The following method in YarnSparkHadoopUtil.scala needs to be modified:
  def escapeForShell(arg: String): String = {
    if (arg != null) {
      // wraps the argument in single quotes (POSIX shell style)
      val escaped = new StringBuilder("'")
      for (i <- 0 to arg.length() - 1) {
        arg.charAt(i) match {
          case '$' => escaped.append("\\$")
          case '"' => escaped.append("\\\"")
          case '\'' => escaped.append("'\\''")
          case c => escaped.append(c)
        }
      }
      escaped.append("'").toString()
    } else {
      arg
    }
  }
After changing the wrapping from single quotes to double quotes, the command runs correctly.
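For illustration, the following is a minimal sketch of what such a change could look like, wrapping the argument in double quotes so that cmd.exe treats it as one token; the method name escapeForShellWindows and the set of escaped characters are assumptions for this example, not the actual patch applied in Spark.

  // Hypothetical sketch only: wrap the argument in double quotes, which
  // cmd.exe does strip, instead of single quotes, which it passes through
  // literally. Only embedded double quotes are escaped here; a complete
  // fix would also need to handle other cmd.exe metacharacters.
  def escapeForShellWindows(arg: String): String = {
    if (arg != null) {
      val escaped = new StringBuilder("\"")
      for (c <- arg) {
        c match {
          case '"' => escaped.append("\\\"")  // escape embedded double quotes
          case other => escaped.append(other)
        }
      }
      escaped.append("\"").toString()
    } else {
      arg
    }
  }

With a variant like this, an option such as -Dspark.app.name=Spark shell would be rendered as "-Dspark.app.name=Spark shell", which cmd.exe parses as a single argument.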
Issue Links
- duplicates SPARK-5754: Spark AM not launching on Windows (Resolved)