  Spark / SPARK-31666

Cannot map hostPath volumes to container


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.4.5
    • Fix Version/s: None
    • Component/s: Kubernetes, Spark Core
    • Labels: None

    Description

      I'm trying to mount additional hostPath directories, as described in a few places:

      https://aws.amazon.com/blogs/containers/optimizing-spark-performance-on-kubernetes/

      https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/user-guide.md#using-volume-for-scratch-space

      https://spark.apache.org/docs/latest/running-on-kubernetes.html#using-kubernetes-volumes

       

      However, whenever I try to submit my job, I run into this error:

      Uncaught exception in thread kubernetes-executor-snapshots-subscribers-1 │
       io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: POST at: https://kubernetes.default.svc/api/v1/namespaces/my-spark-ns/pods. Message: Pod "spark-pi-1588970477877-exec-1" is invalid: spec.containers[0].volumeMounts[1].mountPath: Invalid value: "/tmp1": must be unique. Received status: Status(apiVersion=v1, code=422, details=StatusDetails(causes=[StatusCause(field=spec.containers[0].volumeMounts[1].mountPath, message=Invalid value: "/tmp1": must be unique, reason=FieldValueInvalid, additionalProperties={})], group=null, kind=Pod, name=spark-pi-1588970477877-exec-1, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=Pod "spark-pi-1588970477877-exec-1" is invalid: spec.containers[0].volumeMounts[1].mountPath: Invalid value: "/tmp1": must be unique, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Invalid, status=Failure, additionalProperties={}).

       

      This is my spark-submit command (note: I've tried my own build of Spark for Kubernetes as well as a few other images I've seen floating around, such as seedjeffwan/spark:v2.4.5, and they all have this same issue):

      bin/spark-submit \
       --master k8s://https://my-k8s-server:443 \
       --deploy-mode cluster \
       --name spark-pi \
       --class org.apache.spark.examples.SparkPi \
       --conf spark.executor.instances=2 \
       --conf spark.kubernetes.container.image=my-spark-image:my-tag \
       --conf spark.kubernetes.driver.pod.name=sparkpi-test-driver \
       --conf spark.kubernetes.namespace=my-spark-ns \
       --conf spark.kubernetes.executor.volumes.hostPath.spark-local-dir-2.mount.path=/tmp1 \
       --conf spark.kubernetes.executor.volumes.hostPath.spark-local-dir-2.options.path=/tmp1 \
       --conf spark.local.dir="/tmp1" \
 --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
       local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar 20000
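
      A likely cause of the 422 (an educated guess, not confirmed against this cluster): in Spark 2.4, each path listed in spark.local.dir is mounted into the executor pod as its own implicit emptyDir volume, so spark.local.dir=/tmp1 adds a second volumeMount at /tmp1 on top of the explicit hostPath mount, and the Kubernetes API server rejects the duplicate mountPath. Assuming that is the collision, a sketch of the relevant conf lines with the overlap removed:

```shell
# Sketch (untested): keep the explicit hostPath mount at /tmp1, but do not
# also set spark.local.dir=/tmp1 -- on Spark 2.4 that setting makes Spark
# add its own emptyDir volumeMount at the same path, which is what the
# "must be unique" validation rejects.
 --conf spark.kubernetes.executor.volumes.hostPath.spark-local-dir-2.mount.path=/tmp1 \
 --conf spark.kubernetes.executor.volumes.hostPath.spark-local-dir-2.options.path=/tmp1 \
# (omit: --conf spark.local.dir="/tmp1")
```

      If the goal is specifically to use the hostPath directory as shuffle scratch space, pointing spark.local.dir at a path different from the hostPath mount path would at least avoid the duplicate-mountPath rejection.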

      Any ideas on what's causing this?

       


    People

      Assignee: Unassigned
      Reporter: Stephen Hopper (hopper-signifyd)
      Votes: 0
      Watchers: 4
