Spark / SPARK-33201

Mesos dispatcher service is not working due to empty --py-files conf in cluster mode (the default)


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 3.0.0, 3.0.1
    • Fix Version/s: None
    • Component/s: Mesos
    • Labels: None

    Description

      In Mesos cluster mode, all Spark jobs fail to run because "--py-files" is set to an empty string by default, which causes spark-submit to pick up the wrong jar name. This regression was introduced by SPARK-26466.
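
      The shape of the problem, as a minimal self-contained Scala sketch (the names BuildOptions and pyFiles are illustrative, not Spark's actual dispatcher code): the flag is appended unconditionally, so when the value is empty the joined command line ends up with a dangling --py-files.

      object BuildOptions {
        def main(args: Array[String]): Unit = {
          val pyFiles = "" // empty for a pure-JVM job such as SparkPi
          val options = scala.collection.mutable.Buffer[String]()
          options ++= Seq("--total-executor-cores", "2")
          // Bug: the flag is appended even when its value is empty, so the
          // empty token disappears once the list is joined into a command
          // line, leaving "--py-files" with no value.
          options ++= Seq("--py-files", pyFiles)
          options ++= Seq("--conf", "spark.driver.maxResultSize=15g")
          println(options.mkString(" "))
          // prints: --total-executor-cores 2 --py-files  --conf spark.driver.maxResultSize=15g
        }
      }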

       For example, the generated spark-submit arguments contain:

      --total-executor-cores
      2
      --py-files
      --conf
      spark.driver.maxResultSize=15g

       Because "--py-files" has no value, spark-submit consumes the very next token, "--conf", as its value, and `spark.driver.maxResultSize=15g` is then treated as the application jar name, which causes the error below:

      20/10/19 20:19:18 WARN DependencyUtils: Local jar {dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-0000/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g does not exist, skipping.
      20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load org.apache.spark.examples.SparkPi.
      java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
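
      To make the mis-parse concrete, here is a small illustrative sketch of greedy flag/value parsing (hypothetical code, not SparkSubmit's actual parser): each flag that expects a value consumes the next token, and the first token that is not a flag is taken as the application jar.

      object MisparseDemo {
        // Flags that take a value, as a greedy parser sees them.
        val flagsWithValue = Set("--total-executor-cores", "--py-files", "--conf")

        def main(args: Array[String]): Unit = {
          // The argument vector as spark-submit receives it once the
          // empty --py-files value has vanished.
          var rest = List(
            "--total-executor-cores", "2",
            "--py-files",
            "--conf", "spark.driver.maxResultSize=15g",
            "/path/to/spark-examples.jar")
          var opts = Map.empty[String, String]
          var primaryResource: Option[String] = None
          while (rest.nonEmpty) rest match {
            case flag :: value :: tail if flagsWithValue(flag) =>
              opts += flag -> value // "--py-files" swallows "--conf"
              rest = tail
            case other :: tail =>
              // The first non-flag token becomes the application jar.
              primaryResource = primaryResource.orElse(Some(other))
              rest = tail
            case Nil =>
              rest = Nil
          }
          println(opts("--py-files")) // --conf
          println(primaryResource)    // Some(spark.driver.maxResultSize=15g)
        }
      }

      Run against this argument vector, "--py-files" swallows "--conf" and the stray `spark.driver.maxResultSize=15g` becomes the "jar", matching the DependencyUtils warning above.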

      To reproduce the bug, run any of the Spark-provided examples on Mesos in cluster mode:

       ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master mesos://<server>:7077 --deploy-mode cluster --conf spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 <home>/examples/jars/spark-examples_2.12-3.0.1.jar 100
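
      A plausible shape for the fix, sketched here as hypothetical Scala (not the actual patch that resolved the duplicate issue), is to emit the flag only when the value is non-empty:

      object PyFilesGuard {
        // Emit "--py-files" only when there is actually a value to pass.
        def pyFilesOptions(pyFiles: String): Seq[String] =
          if (pyFiles != null && pyFiles.nonEmpty) Seq("--py-files", pyFiles)
          else Seq.empty

        def main(args: Array[String]): Unit = {
          println(pyFilesOptions(""))         // List() -> flag omitted entirely
          println(pyFilesOptions("deps.zip")) // List(--py-files, deps.zip)
        }
      }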
      


          People

            Assignee: Unassigned
            Reporter: Amandeep (amandeep.kaur)
            Votes: 0
            Watchers: 1

            Dates

              Created:
              Updated:
              Resolved: