
SPARK-18646: ExecutorClassLoader for spark-shell does not honor spark.executor.userClassPathFirst


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.6.2
    • Fix Version/s: 2.3.0
    • Component/s: Spark Shell
    • Labels: None

    Description

      When a spark-shell application is submitted, the executor-side classloader is set to ExecutorClassLoader.
      However, when ExecutorClassLoader is used, the spark.executor.userClassPathFirst parameter is not honored.
      It turns out that, because the ExecutorClassLoader class is defined as

      class ExecutorClassLoader(conf: SparkConf, classUri: String, parent: ClassLoader,
          userClassPathFirst: Boolean) extends ClassLoader with Logging
      

      its parent classloader is actually the system default classloader (because the no-argument ClassLoader constructor is invoked) rather than the "parent" classloader passed to ExecutorClassLoader's constructor.
      As a result, when spark.executor.userClassPathFirst is set to true, even though the "parent" classloader is a ChildFirstURLClassLoader, ExecutorClassLoader.getParent() will return the system default classloader.
      Thus, when ExecutorClassLoader tries to load a class, it first delegates to the system default classloader, which breaks the spark.executor.userClassPathFirst behavior.
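
      The delegation problem can be reproduced outside Spark. Below is a minimal, self-contained sketch (hypothetical names such as BrokenLoader, not Spark source) showing that a ClassLoader subclass which accepts a "parent" argument but never forwards it to its superclass ends up parented to the system classloader:

      import java.net.{URL, URLClassLoader}

      // Accepts `parent` but never passes it to ClassLoader's constructor,
      // mirroring the ExecutorClassLoader declaration above.
      class BrokenLoader(parent: ClassLoader) extends ClassLoader

      object BrokenLoaderDemo {
        def main(args: Array[String]): Unit = {
          // Stand-in for the ChildFirstURLClassLoader that Spark passes in
          // when spark.executor.userClassPathFirst is set to true.
          val intendedParent = new URLClassLoader(Array.empty[URL], null)

          val loader = new BrokenLoader(intendedParent)

          // The no-argument ClassLoader constructor parents the loader to
          // the system classloader, so the intended parent is silently dropped.
          println(loader.getParent eq intendedParent)                   // false
          println(loader.getParent eq ClassLoader.getSystemClassLoader) // true
        }
      }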

      A simple fix would be to define ExecutorClassLoader as:

      class ExecutorClassLoader(conf: SparkConf, classUri: String, parent: ClassLoader,
          userClassPathFirst: Boolean) extends ClassLoader(parent) with Logging
      
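      With the parameter forwarded, the same sketch (again hypothetical names, not the actual Spark patch) shows the intended chain restored, so the default loadClass delegation consults the child-first user classpath loader before the system classloader:

      import java.net.{URL, URLClassLoader}

      // Forwards `parent` to the ClassLoader superclass, as in the fix above.
      class FixedLoader(parent: ClassLoader) extends ClassLoader(parent)

      object FixedLoaderDemo {
        def main(args: Array[String]): Unit = {
          val intendedParent = new URLClassLoader(Array.empty[URL], null)
          val loader = new FixedLoader(intendedParent)

          // getParent() now reports the classloader the instance was
          // constructed with, so delegation reaches the user classpath first.
          println(loader.getParent eq intendedParent) // true
        }
      }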


            People

              Assignee: Min Shen (mshen)
              Reporter: Min Shen (mshen)
