SPARK-47556: [K8S] Spark App ID collision resulting in deleting wrong resources

    Description

      Issue:

      We noticed that sometimes K8s executor pods go into a crash loop, failing with 'Error: MountVolume.SetUp failed for volume "spark-conf-volume-exec"'. Upon investigation we found two Spark jobs that had launched with the same application ID; when the one that finished first cleaned up its resources, it deleted the other job's resources too.

      -> The Spark application ID is created using this code:
      "spark-application-" + System.currentTimeMillis
      This means that if two applications launch in the same millisecond, they end up with the same AppId (see the sketch after this list).

      -> The spark-app-selector label is added to every resource created by the driver, and its value is the application ID. On termination, Spark's Kubernetes scheduler backend deletes all resources carrying the same label.

      This deletes the config map and executor pods of the job that is still running. Its driver tries to relaunch the executor pods, but since the config map is gone, they crash-loop.
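      For illustration, here is a minimal Python sketch of the collision (the real ID comes from the Scala expression above; time.time() is used here to mirror System.currentTimeMillis):

      import time

      def default_app_id():
          # Mirrors "spark-application-" + System.currentTimeMillis:
          # millisecond resolution only, no random component.
          return f"spark-application-{int(time.time() * 1000)}"

      # Two applications started within the same millisecond get the same ID,
      # and therefore the same spark-app-selector label value on all of their
      # K8s resources.
      print(default_app_id() == default_app_id())  # usually True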

      Context

      We are using Spark on Kubernetes and launch our Spark jobs using PySpark. We launch multiple Spark jobs within a given K8s namespace. Each Spark job can be launched from a different pod, or from different processes within a pod. Every job is launched with a unique app name. Here is how a job is launched (omitting irrelevant details):

      # spark_conf has the settings required for Spark on K8s
      from pyspark.sql import SparkSession

      session = (SparkSession.builder
                 .config(conf=spark_conf)
                 .appName('testapp')
                 .master(f'k8s://{kubernetes_host}')
                 .getOrCreate())
      with session:
          session.sql('SELECT 1')

      Repro

      Set the same app ID in the Spark config and run two different jobs: one that finishes fast and one that runs slow. The slower job goes into a crash loop.

      "spark.app.id": "<same Id for 2 spark job>"

      Workaround

      Set a unique spark.app.id for every job that runs on K8s, e.g.:

      "spark.app.id": f'{AppName}-{CurrTimeInMilliSecs}-{UUId}'[:63]

      Fix

      Append a unique hash to the end of the application ID: https://github.com/apache/spark/pull/45712
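      The idea behind the fix, sketched in Python (the actual change is in Spark's Scala code and the exact suffix format is defined in the linked PR; this helper is illustrative only):

      import secrets
      import time

      def app_id_with_suffix():
          # Same millisecond timestamp as before, plus a short random suffix so
          # that two applications started in the same millisecond no longer
          # collide.
          return f"spark-application-{int(time.time() * 1000)}-{secrets.token_hex(3)}"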

       

      People

        Assignee: Unassigned
        Reporter: Sundeep K