Spark / SPARK-26957

Add config properties to configure the default scheduler pool priorities


Details

    • Type: Improvement
    • Status: In Progress
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core
    • Labels: None

    Description

      Currently, it is possible to dynamically create new scheduler pools in Spark just by setting the spark.scheduler.pool local property to a new value.
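
      For example, submitting a job against a pool name that has not been declared anywhere is enough to create it. A minimal sketch (the fair scheduler is assumed to be enabled via spark.scheduler.mode=FAIR, and the pool name "projectA" is illustrative):

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder()
          .appName("dynamic-pool-demo")
          .config("spark.scheduler.mode", "FAIR")
          .getOrCreate()
        val sc = spark.sparkContext

        // Jobs submitted from this thread go to the "projectA" pool. If the
        // pool is not declared in an allocation file, Spark creates it on
        // the fly with the built-in defaults.
        sc.setLocalProperty("spark.scheduler.pool", "projectA")
        sc.parallelize(1 to 1000).count()

        // Clearing the property sends later jobs back to the default pool.
        sc.setLocalProperty("spark.scheduler.pool", null)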

      We use this capability to create separate pools for different projects that run jobs in the same long-lived driver application. Each project gets its own pool, and within the pool jobs are executed in a FIFO manner.

      This setup works well, except that we also have a low-priority queue for background tasks. We would prefer that all of the dynamically created pools have a higher priority than this background queue.
      We can accomplish this for pools we know about in advance by hardcoding the project queue names in a spark_pools.xml config file and setting their priority to 100, as sketched below.
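
      For reference, a minimal sketch of such an allocation file (the file name and pool names are ours; the element Spark reads for relative priority in the allocation file is weight):

        <?xml version="1.0"?>
        <allocations>
          <!-- Hardcoded project pools, each favored over the background pool. -->
          <pool name="projectA">
            <schedulingMode>FIFO</schedulingMode>
            <weight>100</weight>
            <minShare>0</minShare>
          </pool>
          <pool name="background">
            <schedulingMode>FIFO</schedulingMode>
            <weight>1</weight>
            <minShare>0</minShare>
          </pool>
        </allocations>

      The file is picked up by pointing spark.scheduler.allocation.file at it, but this only works for pool names that are known ahead of time.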

      Unfortunately, there is no way to set the priority for dynamically created pools; they are all hardcoded to 1. It would be nice if there were configuration settings to change this default.
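
      One possible shape for such settings, shown purely as a hypothetical sketch (these property names do not exist in Spark today and are only meant to illustrate the request), would be defaults applied to any pool created on the fly:

        # Hypothetical spark-defaults.conf entries for dynamically created pools
        spark.scheduler.pool.default.schedulingMode   FIFO
        spark.scheduler.pool.default.weight           100
        spark.scheduler.pool.default.minShare         0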


            People

              Assignee: Unassigned
              Reporter: Dave DeCaprio (DaveDeCaprio)
              Votes: 0
              Watchers: 1

              Dates

                Created:
                Updated: