[SPARK-26886] Proper termination of external processes launched by the worker


Details

    • Type: Story
    • Status: Resolved
    • Priority: Minor
    • Resolution: Won't Do
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      When embedding a deep learning framework in Spark, the Spark worker sometimes has to launch an external process (e.g. an MPI task).

      val nothing = inputData.barrier().mapPartitions { _ =>
        val barrierTask = BarrierTaskContext.get()
        // save data to disk
        barrierTask.barrier()
        // launch external process, e.g. MPI task + TensorFlow
      }

       
      The problem is that the external process keeps running when the Spark task is killed manually. This JIRA is the place to discuss how to properly terminate external processes launched by the Spark worker when the Spark task is killed or interrupted.
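
      One possible mitigation, sketched below purely as an illustration (this issue was resolved as Won't Do, so this is not the project's adopted solution): register a task completion listener that destroys the child process, since Spark runs completion listeners whether the task succeeds, fails, or is killed. The names inputData and exitCodes and the "mpirun my_tf_job" command line are placeholders, and the approach cannot help if the executor JVM itself dies abruptly (e.g. via SIGKILL).

      import org.apache.spark.{BarrierTaskContext, TaskContext}

      val exitCodes = inputData.barrier().mapPartitions { _ =>
        val barrierTask = BarrierTaskContext.get()
        // save data to disk, then wait for every task in the stage
        barrierTask.barrier()

        // launch the external process (placeholder command line)
        val proc = new ProcessBuilder("mpirun", "my_tf_job").inheritIO().start()

        // reap the child when the task finishes for any reason:
        // success, failure, or a manual kill/interrupt
        barrierTask.addTaskCompletionListener { (_: TaskContext) =>
          if (proc.isAlive) proc.destroyForcibly()
        }

        // waitFor() is interruptible, so killing the task (interruptThread = true)
        // unblocks it and the listener above then destroys the child
        Iterator.single(proc.waitFor())
      }
      exitCodes.collect() // an action is still needed to actually run the barrier stage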

          People

            Assignee: Unassigned
            Reporter: luzengxiang
            Shepherd: Xiangrui Meng
            Votes: 0
            Watchers: 1
