[SPARK-27104] Spark Fair scheduler across applications in standalone mode


Details

    • Type: Wish
    • Status: Resolved
    • Priority: Minor
    • Resolution: Incomplete
    • Affects Version/s: 2.2.3, 2.3.3, 2.4.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core

    Description

      Spark in standalone mode currently only supports a FIFO (first-in-first-out) scheduler across applications.

      It would be great if a fair scheduler were supported: a fair scheduler across applications, not within a single application.
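      For contrast, here is a minimal sketch (assuming Spark 2.x and a hypothetical standalone master URL) of the fair scheduling that exists today: setting spark.scheduler.mode to FAIR only affects how jobs are scheduled inside a single application (one SparkContext); it does not change how the standalone master assigns cores across applications.

      import org.apache.spark.{SparkConf, SparkContext}

      // Existing FAIR mode: schedules jobs *within this one application* fairly.
      // It does not influence how the standalone master distributes cores
      // across different applications, which is what this issue asks for.
      val conf = new SparkConf()
        .setAppName("fair-within-one-app")      // hypothetical application name
        .setMaster("spark://master:7077")       // hypothetical standalone master URL
        .set("spark.scheduler.mode", "FAIR")    // intra-application job scheduling only
      val sc = new SparkContext(conf)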

       

      Use case (for example, integration with Zeppelin):

      At a certain moment, user A submits a heavy application that consumes all the resources of the Spark cluster.

      At a later moment, user B submits a second application.

      No matter how many worker nodes are added at that point, all the resources go to user A because of the FIFO policy. User B will never get any resources until user A releases its allocated resources.
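      A minimal sketch of the manual workaround that is possible in standalone mode today, assuming a hypothetical cluster and an assumed cap of 8 cores: spark.cores.max limits how many cores one application may take, so a later application can still get executors. This is a static, hand-tuned cap, not a fair share.

      import org.apache.spark.sql.SparkSession

      // Cap user A's application so it cannot occupy the whole standalone cluster.
      // Later applications (user B) can then still obtain executors, but the cap
      // has to be chosen manually and does not adapt to the number of applications.
      val spark = SparkSession.builder()
        .appName("user-a-heavy-app")            // hypothetical application
        .master("spark://master:7077")          // hypothetical standalone master URL
        .config("spark.cores.max", "8")         // assumed cap, tuned per cluster
        .getOrCreate()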

       

      A fair scheduler should distribute extra resources fairly among all running applications that demand resources.
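      As an illustration of the requested behaviour (not Spark code, just an assumed max-min style split written in Scala), free cores could be divided evenly among the applications that still demand resources, with any leftover redistributed:

      // Illustrative sketch only; the function name, inputs and policy are assumptions.
      def fairShare(freeCores: Int, demands: Map[String, Int]): Map[String, Int] = {
        var remaining   = freeCores
        var allocation  = demands.map { case (app, _) => app -> 0 }
        var unsatisfied = demands.filter(_._2 > 0)
        while (remaining > 0 && unsatisfied.nonEmpty) {
          // Give each application that still wants cores an equal slice of what is left.
          val slice = math.max(1, remaining / unsatisfied.size)
          for ((app, demand) <- unsatisfied if remaining > 0) {
            val grant = math.min(math.min(slice, demand - allocation(app)), remaining)
            allocation = allocation.updated(app, allocation(app) + grant)
            remaining -= grant
          }
          unsatisfied = unsatisfied.filter { case (app, demand) => allocation(app) < demand }
        }
        allocation
      }

      // Example: 10 free cores, user A demands 100, user B demands 4 -> A gets 6, B gets 4.
      println(fairShare(10, Map("userA-app" -> 100, "userB-app" -> 4)))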

       


          People

            Assignee: Unassigned
            Reporter: Hua Zhang (happyhua)
            Votes: 0
            Watchers: 1
