Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Not A Bug
- Affects Version/s: 3.1.0
- Fix Version/s: None
- Component/s: None
Description
I see the default value of the master setting in the spark-submit client:

// Global defaults. These should be keep to minimum to avoid confusing behavior.
master = Option(master).getOrElse("local[*]")
However, during development and debugging we often hit this error:

Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration

which conflicts with that default.
If we do:

val sparkConf = new SparkConf().setAppName("app")
// When using the client to submit tasks to the cluster, the master passed by
// spark-submit will be overwritten by this local setting:
sparkConf.set("spark.master", "local[*]")
so we have to do this instead:

val sparkConf = new SparkConf().setAppName("app")
// Because a master set in the program takes priority, we first check whether
// spark.master is already set, so a cluster submission is not overridden:
sparkConf.set("spark.master", sparkConf.get("spark.master", "local[*]"))
The same applies to spark.app.name.
Would it be better for users if SparkConf handled these defaults the way the submit client does?
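To illustrate the "set a default only if the key is absent" guard described above without needing a Spark cluster, here is a minimal sketch using a hypothetical MiniConf class that models SparkConf's key-value settings (the real SparkConf exposes a similar setIfMissing method):

```scala
import scala.collection.mutable

// Hypothetical mini model of SparkConf's key-value settings, used only to
// illustrate the guard; it is not the real SparkConf.
class MiniConf {
  private val settings = mutable.Map[String, String]()
  def set(key: String, value: String): MiniConf = { settings(key) = value; this }
  def get(key: String, default: String): String = settings.getOrElse(key, default)
  // The guard from the description: keep an existing value (e.g. one injected
  // by spark-submit), otherwise fall back to a local default.
  def setIfMissing(key: String, value: String): MiniConf =
    set(key, get(key, value))
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Case 1: spark-submit already injected a cluster master; it is preserved.
    val submitted = new MiniConf().set("spark.master", "yarn")
    submitted.setIfMissing("spark.master", "local[*]")
    println(submitted.get("spark.master", "")) // yarn

    // Case 2: local debugging with nothing set; the default kicks in.
    val local = new MiniConf()
    local.setIfMissing("spark.master", "local[*]")
    println(local.get("spark.master", "")) // local[*]
  }
}
```

The same guard works for spark.app.name: an unconditional set always wins over whatever spark-submit provided, while the check-before-set form only fills in a value when none exists.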