Details
Type: Improvement
Status: Resolved
Priority: Minor
Resolution: Duplicate
Affects Version/s: 2.4.4
Fix Version/s: None
Component/s: None
Description
In the Spark Kubernetes Dockerfile, the Spark binaries are copied to /opt/spark.
If we try to create our own Dockerfile that does not use /opt/spark, the image will not run.
Looking at the source code, it seems that the path is hard-coded to /opt/spark in various places.
Example, from Constants.scala:
// Spark app configs for containers
val SPARK_CONF_VOLUME = "spark-conf-volume"
val SPARK_CONF_DIR_INTERNAL = "/opt/spark/conf"
Is it possible to make this configurable so we can install Spark somewhere other than /opt/?
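For illustration only, here is a minimal sketch of one way such a constant could be made configurable rather than hard-coded. The SPARK_IMAGE_HOME environment variable below is hypothetical (not an existing Spark setting), and /opt/spark is kept as the fallback so current images keep working:

object Constants {
  // Spark app configs for containers
  val SPARK_CONF_VOLUME = "spark-conf-volume"

  // Hypothetical: resolve the in-image Spark home from the environment
  // instead of hard-coding it, falling back to the current default.
  private val sparkImageHome: String =
    sys.env.getOrElse("SPARK_IMAGE_HOME", "/opt/spark")

  val SPARK_CONF_DIR_INTERNAL = s"$sparkImageHome/conf"
}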
Issue Links
- duplicates SPARK-24655 [K8S] Custom Docker Image Expectations and Documentation (Resolved)