Details

Type: Story
Status: Resolved
Priority: Minor
Resolution: Won't Do
Affects Version/s: 2.4.0
Fix Version/s: None
Component/s: None
Description
When embedding a deep learning framework in Spark, the Spark worker has to launch an external process (e.g. an MPI task) in some cases. For example:
import org.apache.spark.BarrierTaskContext

val nothing = inputData.barrier().mapPartitions { _ =>
  val barrierTask = BarrierTaskContext.get()
  // save data to disk
  barrierTask.barrier()
  // launch external process, e.g. an MPI task + TensorFlow
  Iterator.empty
}
The problem is that the external process keeps running when the Spark task is killed manually. This JIRA is the place to discuss how to properly terminate external processes launched by a Spark worker when the Spark task is killed or interrupted.
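A minimal sketch of one possible mitigation (not the resolution of this JIRA), assuming the external command is started from the task via java.lang.ProcessBuilder: register a task completion listener that forcibly destroys the child process when the task ends for any reason, including a manual kill. The mpirun command line below is a placeholder.

import org.apache.spark.{BarrierTaskContext, TaskContext}

val nothing = inputData.barrier().mapPartitions { _ =>
  val barrierTask = BarrierTaskContext.get()
  barrierTask.barrier()

  // Placeholder command; any long-running external process applies.
  val process = new ProcessBuilder("mpirun", "-n", "1", "my_mpi_task")
    .inheritIO()
    .start()

  // Best-effort cleanup: completion listeners run when the task
  // completes, fails, or is killed/interrupted.
  TaskContext.get().addTaskCompletionListener[Unit] { _ =>
    if (process.isAlive) process.destroyForcibly()
  }

  // waitFor() is interrupted when the task is killed, which fails the
  // task and triggers the listener above.
  process.waitFor()
  Iterator.empty
}

This is only best-effort: if the executor JVM itself is killed abruptly (e.g. SIGKILL), no listener runs and the child process is orphaned anyway, which is why first-class support in Spark is worth discussing here.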