SPARK-26369: How to limit the number of concurrent tasks in one Spark job?


Details

    • Type: Question
    • Status: Closed
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.1.0, 2.2.0, 2.3.2, 2.4.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core
    • Labels: None

    Description

      Hi all,
      Is it possible to make the fair scheduler pools pluggable, so that we
      could implement our own SchedulingAlgorithm? In our case we want to
      limit the maximum number of concurrent tasks for a job that loads data
      from a MySQL database: if we allow the full executor.number *
      cores.number, the load triggers an alarm on the database. Or can we do
      this in another way?
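
      As of the affected versions there is no built-in per-job cap on
      concurrent tasks, and fair scheduler pools only control relative
      shares (weight, minShare), not a maximum. For a JDBC source
      specifically, one common workaround is to cap numPartitions on the
      read, which bounds how many read tasks (and MySQL connections) can
      run at once. A minimal sketch; the host, table, credentials, bounds,
      and output path are all hypothetical:

          import org.apache.spark.sql.SparkSession

          object LimitedMysqlLoad {
            def main(args: Array[String]): Unit = {
              val spark = SparkSession.builder()
                .appName("limited-mysql-load")
                .getOrCreate()

              // numPartitions caps how many partitions the JDBC read
              // produces, and therefore how many tasks (and MySQL
              // connections) can run concurrently for this load,
              // independently of executor.number * cores.number.
              val df = spark.read
                .format("jdbc")
                .option("url", "jdbc:mysql://db-host:3306/mydb") // hypothetical
                .option("dbtable", "my_table")                   // hypothetical
                .option("user", "app_user")                      // hypothetical
                .option("password", "secret")                    // hypothetical
                .option("partitionColumn", "id")  // assumes a numeric key column
                .option("lowerBound", "1")
                .option("upperBound", "1000000")
                .option("numPartitions", "8")     // at most 8 concurrent read tasks
                .load()

              df.write.parquet("/tmp/my_table")   // hypothetical output path
              spark.stop()
            }
          }

      Alternatively, capping the whole application (for example with
      spark.cores.max on a standalone cluster, or a small fixed executor
      count) limits every job in it, which may be too coarse when only
      the MySQL-loading stage needs throttling.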


People

    Assignee: Unassigned
    Reporter: Fu Chen (fchen)
    Votes: 0
    Watchers: 1
