Spark / SPARK-27750

Standalone scheduler - ability to prioritize applications over drivers; many drivers act like a denial of service


Details

    • Type: New Feature
    • Status: Closed
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core
    • Labels: None

Description

If I submit 1000 spark-submit drivers, they consume all the cores on my cluster (essentially acting like a denial of service) and no Spark 'application' gets to run, since the cores are all consumed by the 'drivers'. This feature is about having the ability to prioritize applications over drivers so that at least some 'applications' can start running. I guess it would be something like:

if (driver.state = 'submitted' and (exists some app with app.state = 'submitted')) then set app.state = 'running'

if all apps have app.state = 'running' then set driver.state = 'running'

       
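What that policy could look like in a standalone Master's scheduling pass, sketched below as self-contained Scala. This is not Spark's actual Master.schedule() code; AppFirstScheduler, Cluster, WaitingApp, and WaitingDriver are hypothetical names for illustration. The idea is simply that waiting applications are granted cores first, and waiting drivers may only launch once no application is left waiting, so a flood of submitted drivers can no longer starve applications.

{code:scala}
object AppFirstScheduler {

  final case class WaitingDriver(id: String, cores: Int)
  final case class WaitingApp(id: String, coresRequested: Int)

  final case class Cluster(
      freeCores: Int,
      waitingApps: List[WaitingApp],
      waitingDrivers: List[WaitingDriver])

  // One scheduling pass: grant cores to waiting applications first; waiting
  // drivers may only claim cores once no application is left waiting.
  def schedule(c: Cluster): Cluster = {
    var free = c.freeCores

    // Phase 1: applications take strict priority over drivers.
    val appsStillWaiting = c.waitingApps.filterNot { app =>
      val fits = app.coresRequested <= free
      if (fits) free -= app.coresRequested
      fits
    }

    // Phase 2: drivers run only when every application has been placed,
    // so a burst of 1000 submitted drivers cannot starve applications.
    val driversStillWaiting =
      if (appsStillWaiting.isEmpty) {
        c.waitingDrivers.filterNot { d =>
          val fits = d.cores <= free
          if (fits) free -= d.cores
          fits
        }
      } else c.waitingDrivers

    Cluster(free, appsStillWaiting, driversStillWaiting)
  }

  def main(args: Array[String]): Unit = {
    // The scenario from the description: 1000 one-core drivers plus one app.
    val drivers = (1 to 1000).map(i => WaitingDriver(s"driver-$i", cores = 1)).toList
    val before = Cluster(
      freeCores = 4,
      waitingApps = List(WaitingApp("app-1", coresRequested = 2)),
      waitingDrivers = drivers)
    val after = schedule(before)
    println(s"apps waiting: ${after.waitingApps.size}, drivers waiting: ${after.waitingDrivers.size}")
    // prints: apps waiting: 0, drivers waiting: 998
  }
}
{code}

Running main plays out the 1000-driver scenario above: the application is placed first, and only the leftover cores go to drivers.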

Secondary to this: why must a driver consume a minimum of one entire core?
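On that secondary question: in standalone cluster mode the driver's reservation comes from spark.driver.cores (default 1), and that setting takes whole numbers, which is why one full core is the floor. A minimal sketch of the relevant settings (the master URL is a placeholder):

{code:scala}
import org.apache.spark.SparkConf

// Assumes a standalone cluster; spark://master-host:7077 is a placeholder URL.
val conf = new SparkConf()
  .setMaster("spark://master-host:7077")
  .set("spark.driver.cores", "1") // driver cores come in whole units; 1 is the minimum
  .set("spark.cores.max", "4")    // caps the total executor cores one application may take
{code}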


People

    Assignee: Unassigned
    Reporter: t oo (toopt4)
    Votes: 0
    Watchers: 2

Dates

    Created:
    Updated:
    Resolved: