Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Versions: 2.3.2, 2.4.0
Description
When I run the SparkSubmitSuite of Spark 2.3.2 in the IntelliJ IDE, I found that some tests fail when run individually, but pass when the whole SparkSubmitSuite is run.
Failed test when run separately:
test("SPARK_CONF_DIR overrides spark-defaults.conf") {
  forConfDir(Map("spark.executor.memory" -> "2.3g")) { path =>
    val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
    val args = Seq(
      "--class", SimpleApplicationTest.getClass.getName.stripSuffix("$"),
      "--name", "testApp",
      "--master", "local",
      unusedJar.toString)
    val appArgs = new SparkSubmitArguments(args, Map("SPARK_CONF_DIR" -> path))
    assert(appArgs.defaultPropertiesFile != null)
    assert(appArgs.defaultPropertiesFile.startsWith(path))
    assert(appArgs.propertiesFile == null)
    appArgs.executorMemory should be ("2.3g")
  }
}
Failure reason:
Error: Executor Memory cores must be a positive number Run with --help for usage help or --verbose for debug output
After carefully checking the code, I found that the exitFn of SparkSubmit had been overridden by earlier tests through the testPrematureExit helper.
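The failure mode can be illustrated with a minimal, self-contained sketch (this is not Spark's actual code; FakeSparkSubmit and ExitFnLeakDemo are hypothetical names that mirror SparkSubmit's mutable exitFn hook): one test swaps the exit hook for a throwing one and never restores it, so every later test in the same JVM sees the changed behavior.

```scala
// Hypothetical stand-in for SparkSubmit's shared, mutable exit hook.
object FakeSparkSubmit {
  // Default behaves like System.exit; tests may swap it out.
  var exitFn: Int => Unit = (code: Int) => sys.exit(code)

  def printErrorAndExit(msg: String): Unit = {
    println(s"Error: $msg")
    exitFn(1)
  }
}

object ExitFnLeakDemo extends App {
  // Test A (like testPrematureExit) replaces exitFn with one that throws,
  // so the JVM is not killed while parsing bad arguments.
  FakeSparkSubmit.exitFn = (_: Int) => throw new IllegalStateException("premature exit")

  // Test B runs later in the same JVM. Test A never restored exitFn, so any
  // argument-validation error now throws instead of exiting: the leaked
  // override changes the behavior of every subsequent test.
  try {
    FakeSparkSubmit.printErrorAndExit("executor memory must be a positive number")
  } catch {
    case e: IllegalStateException => println(s"leaked override fired: ${e.getMessage}")
  }
}
```

Run individually, Test B would never have touched exitFn, which is why the test passes in the full suite but fails alone with a different error path.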
Although the above test was fixed by SPARK-22941, the leaked override of exitFn might still cause other problems in the future.
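One defensive pattern (a sketch under the assumption that the exit hook stays a mutable field, not Spark's actual fix; Submit and ExitFnGuard are hypothetical names) is to snapshot the hook, run the test body, and restore the hook in a finally block so an override cannot leak:

```scala
// Self-contained sketch: a mutable exit hook plus a helper that
// restores it after each use.
object Submit {
  var exitFn: Int => Unit = (code: Int) => sys.exit(code)
}

object ExitFnGuard {
  // Swap in `replacement` for the duration of `body`, then restore the
  // previous hook even if `body` throws.
  def withExitFn[T](replacement: Int => Unit)(body: => T): T = {
    val saved = Submit.exitFn
    Submit.exitFn = replacement
    try body finally Submit.exitFn = saved
  }
}
```

A helper like testPrematureExit could wrap its body in withExitFn so that later tests always see the default hook again, regardless of execution order.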