Spark / SPARK-26588

Idle executor should properly be killed when no job is submitted


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core
    • Labels: None

    Description

      I enabled the dynamic allocation feature with spark-shell and did not submit any task. After spark.dynamicAllocation.executorIdleTimeout seconds (default 60s), there was still one active executor, which is abnormal: all idle executors should time out and be removed (the default spark.dynamicAllocation.minExecutors is 0). The spark-shell command is shown below:

      spark-shell --master=yarn --conf spark.ui.port=8040 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.maxExecutors=8 --conf spark.dynamicAllocation.initialExecutors=4 --conf spark.shuffle.service.enabled=true
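      For reference, the same dynamic-allocation settings can equivalently be placed in spark-defaults.conf. This is a sketch using only the properties mentioned above; minExecutors and executorIdleTimeout are shown at their stated defaults (0 and 60s):

      ```
      # Enable dynamic allocation (requires the external shuffle service)
      spark.dynamicAllocation.enabled              true
      spark.shuffle.service.enabled                true

      # Executor pool sizing used in the reproduction above
      spark.dynamicAllocation.initialExecutors     4
      spark.dynamicAllocation.maxExecutors         8

      # Defaults relevant to this report: idle executors should be
      # released after 60s, all the way down to zero executors
      spark.dynamicAllocation.minExecutors         0
      spark.dynamicAllocation.executorIdleTimeout  60s
      ```

      With these settings and no submitted task, the expectation is that every executor becomes idle and the pool shrinks to minExecutors (zero) after the idle timeout.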
      

    People

      Assignee: Unassigned
      Reporter: Qingxin Wu (wuqingxinnn)
      Votes: 0
      Watchers: 1
