Details
- Type: Improvement
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 3.5.0
- Fix Version/s: None
Description
Respect the user-defined SPARK_LOCAL_DIRS container env when setting up local dirs.

For example, we use a hostPath volume for the Spark local dir. We do not mount the individual sub-disks directly into the pod; instead, we mount a single root path into the Spark driver/executor pod. For example, the root path is `/hadoop`, and the sub-disks live under it: `/hadoop/1`, `/hadoop/2`, `/hadoop/3`, `/hadoop/4`. We want to define SPARK_LOCAL_DIRS in the driver/executor pod env to point at those sub-disks, but currently the user-specified SPARK_LOCAL_DIRS does not take effect.
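The desired setup can be sketched as a pod spec along these lines (the container name, paths, and disk count are illustrative assumptions, not part of the report):

```yaml
# Hypothetical executor pod sketch: mount the host's /hadoop root into the
# pod, then point SPARK_LOCAL_DIRS at the sub-disks under that mount.
apiVersion: v1
kind: Pod
spec:
  containers:
    - name: spark-executor
      env:
        # User-provided local dirs; as reported, Spark currently ignores
        # this value when it sets up local dirs for the pod.
        - name: SPARK_LOCAL_DIRS
          value: "/hadoop/1,/hadoop/2,/hadoop/3,/hadoop/4"
      volumeMounts:
        - name: hadoop-root
          mountPath: /hadoop
  volumes:
    - name: hadoop-root
      hostPath:
        path: /hadoop
        type: Directory
```

With the requested improvement, Spark would keep this user-provided SPARK_LOCAL_DIRS value instead of overwriting it with its own defaults.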