Details
- Type: Test
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 3.0.0
- Labels: None
Description
Currently, if we run the Kubernetes integration tests with SPARK_HOME already set, the scripts use that SPARK_HOME even when --spark-tgz is specified:
export SPARK_HOME=`pwd`
dev/make-distribution.sh --pip --tgz -Phadoop-2.7 -Pkubernetes
resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh --deploy-mode docker-for-desktop --spark-tgz $PWD/spark-*.tgz
+ /.../spark/resource-managers/kubernetes/integration-tests/target/spark-dist-unpacked/bin/docker-image-tool.sh -r docker.io/kubespark -t 650B51C8-BBED-47C9-AEAB-E66FC9A0E64E -p /.../spark/resource-managers/kubernetes/integration-tests/target/spark-dist-unpacked/kubernetes/dockerfiles/spark/bindings/python/Dockerfile build
cp: resource-managers/kubernetes/docker/src/main/dockerfiles: No such file or directory
cp: assembly/target/scala-2.12/jars: No such file or directory
cp: resource-managers/kubernetes/integration-tests/tests: No such file or directory
cp: examples/target/scala-2.12/jars/*: No such file or directory
cp: resource-managers/kubernetes/docker/src/main/dockerfiles: No such file or directory
cp: resource-managers/kubernetes/docker/src/main/dockerfiles: No such file or directory
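The intended precedence can be sketched as follows: when --spark-tgz is given, the unpacked distribution directory should override any SPARK_HOME inherited from the caller's environment. This is a minimal illustration, not the actual test-script code; the variable names SPARK_TGZ and the stale path are hypothetical.

```shell
#!/bin/sh
# Sketch of the desired behavior (illustrative names, not Spark's real script).
SPARK_HOME=/opt/stale-spark          # inherited from the caller's environment
SPARK_TGZ=/tmp/spark-3.0.0.tgz       # value that would come from --spark-tgz

if [ -n "$SPARK_TGZ" ]; then
  # The tarball is unpacked into the integration-tests target directory,
  # and that unpacked distribution should win over the inherited SPARK_HOME.
  SPARK_HOME="target/spark-dist-unpacked"
fi

echo "$SPARK_HOME"
```

With this guard, docker-image-tool.sh would be invoked from the unpacked distribution and its relative `cp` paths would resolve, instead of falling back to the developer's checkout.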
Issue Links
- links to