Spark / SPARK-46091

[KUBERNETES] Respect the existing kubernetes container SPARK_LOCAL_DIRS env


Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.5.0
    • Fix Version/s: None
    • Component/s: Kubernetes

    Description

      Respect the user-defined SPARK_LOCAL_DIRS container env when setting up the local dirs.

      For example, we use hostPath volumes for the Spark local dirs.

      But we do not mount the sub disks into the pod directly; instead we mount a single root path into the Spark driver/executor pod.

      For example, the root path is `/hadoop`.

      And there are sub disks under it, like `/hadoop/1, /hadoop/2, /hadoop/3, /hadoop/4`.

      And we want to define SPARK_LOCAL_DIRS ourselves in the driver/executor pod container env, for example:
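      For illustration only, this is roughly what the executor container of such a pod template would carry. It is expressed here with the fabric8 model classes that the Spark Kubernetes backend uses; the container name, volume name, and paths are just the ones from this example, not an official template:

```scala
import io.fabric8.kubernetes.api.model.{Container, ContainerBuilder}

object PodTemplateEnvExample {
  // Hypothetical executor container from a user pod template: SPARK_LOCAL_DIRS is
  // pinned to the sub disks that live under the /hadoop hostPath root.
  val executorContainer: Container = new ContainerBuilder()
    .withName("spark-kubernetes-executor")
    .addNewEnv()
      .withName("SPARK_LOCAL_DIRS")
      .withValue("/hadoop/1,/hadoop/2,/hadoop/3,/hadoop/4")
    .endEnv()
    .addNewVolumeMount()
      .withName("hadoop-root")
      .withMountPath("/hadoop")
    .endVolumeMount()
    .build()
}
```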

       

      But currently the user-specified SPARK_LOCAL_DIRS does not take effect; it is not respected when Spark sets up the local dirs for the pod.
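      A minimal sketch of the behavior we are asking for follows; it is not the actual patch, and the helper name and its wiring into the Kubernetes feature steps (presumably LocalDirsFeatureStep) are assumptions. The idea is simply: if the container coming out of the pod template already defines SPARK_LOCAL_DIRS, keep it, and only inject the resolved local dirs otherwise.

```scala
import scala.jdk.CollectionConverters._

import io.fabric8.kubernetes.api.model.{Container, ContainerBuilder}

object RespectLocalDirsEnvSketch {
  private val EnvName = "SPARK_LOCAL_DIRS"

  // Only inject SPARK_LOCAL_DIRS when the pod template has not already set it, so a
  // user-provided value such as "/hadoop/1,/hadoop/2,/hadoop/3,/hadoop/4" is kept
  // instead of being overwritten by the resolved local dirs.
  def configureLocalDirsEnv(container: Container, resolvedLocalDirs: Seq[String]): Container = {
    val existingEnv = Option(container.getEnv).map(_.asScala.toList).getOrElse(Nil)
    if (existingEnv.exists(_.getName == EnvName)) {
      // The user already defined it in the pod template: respect it as-is.
      container
    } else {
      new ContainerBuilder(container)
        .addNewEnv()
          .withName(EnvName)
          .withValue(resolvedLocalDirs.mkString(","))
        .endEnv()
        .build()
    }
  }
}
```

      Keying the check on the container env (rather than only on `spark.local.dir`) lets the pod template remain the single place where these host-specific paths are declared.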

       

       


            People

              Assignee: Unassigned
              Reporter: Fei Wang (feiwang)
              Votes: 0
              Watchers: 1

              Dates

                Created:
                Updated: