Details
- Type: Improvement
- Status: In Progress
- Priority: Minor
- Resolution: Unresolved
- Affects Version: 3.1.0
- Fix Version: None
- Component: None
Description
In the scripts under bin/ and sbin/, it would be useful to be able to specify SPARK_JARS_DIR in the same way as SPARK_CONF_DIR.
Our use case:
We are deploying Spark 2.4.5 with YARN on HDP 2.6.4. Because of an incompatible conflict on commons-lang3 between Spark 2.4.5 and HDP 2.6.4, we tweak the jars with the Maven Shade Plugin.
The resulting jars differ slightly from the stock Spark 2.4.5 jars, and we place them in a directory other than the default. It would therefore be useful if we could set SPARK_JARS_DIR so that the bin/ and sbin/ scripts point at that directory.
We could work around this without the modification by deploying one full Spark home per set of jars, but that is somewhat redundant.
Common use case:
I believe there are similar use cases. For example, deploying Spark builds for both Scala 2.11 and Scala 2.12 on one machine and switching between jar locations by setting SPARK_JARS_DIR.
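A minimal sketch of what the proposal could look like in the launch scripts, assuming the usual shell-parameter-expansion fallback pattern (the SPARK_HOME default here is illustrative, not taken from the actual scripts):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: honor a user-provided SPARK_JARS_DIR from the
# environment, falling back to the default location under SPARK_HOME,
# analogous to how SPARK_CONF_DIR can be overridden.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"

# If SPARK_JARS_DIR is set and non-empty, keep it; otherwise use the default.
SPARK_JARS_DIR="${SPARK_JARS_DIR:-"${SPARK_HOME}/jars"}"

echo "$SPARK_JARS_DIR"
```

With this pattern, `SPARK_JARS_DIR=/opt/spark-jars-scala212 bin/spark-submit ...` would pick up the alternate jar directory, while existing deployments that never set the variable keep the current default behavior.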
Attachments
Issue Links
- contains
-
SPARK-31435 Add SPARK_JARS_DIR environment variable (new) to Spark configuration documentation
- Resolved