Spark / SPARK-31193

Set default values for spark.master and spark.app.name


Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Not A Bug
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      I see that the spark-submit client applies a default value for the master setting:

      // Global defaults. These should be keep to minimum to avoid confusing behavior.
      master = Option(master).getOrElse("local[*]")
      

      but during development and debugging we run into this error:

      Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration

      This conflicts with that default, because the default is applied only by the spark-submit client, not by SparkConf itself.
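      For illustration, here is a minimal sketch (the object name MissingMasterExample is made up) that reproduces the error when the main class is run directly, e.g. from an IDE, rather than through spark-submit:

      import org.apache.spark.{SparkConf, SparkContext}

      object MissingMasterExample {
        def main(args: Array[String]): Unit = {
          // No master is set here and no spark-submit client is involved, so
          // constructing the SparkContext throws org.apache.spark.SparkException:
          // "A master URL must be set in your configuration".
          val conf = new SparkConf().setAppName("app")
          val sc = new SparkContext(conf)
          sc.stop()
        }
      }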

       

      // If we simply do this:
      val sparkConf = new SparkConf().setAppName("app")
      // then when we use the client to submit the job to a cluster, the master
      // passed by spark-submit is overwritten by the hard-coded local value:
      sparkConf.set("spark.master", "local[*]")

       

      so we have to do this instead:

      val sparkConf = new SparkConf().setAppName("app")
      // Because values set in code take priority over the master passed by
      // spark-submit, we must first check whether a master is already set,
      // so that the local fallback does not break cluster submission.
      sparkConf.set("spark.master", sparkConf.get("spark.master", "local[*]"))


      The same applies to spark.app.name.

      Would it be better for users if these defaults were handled the way the submit client handles them?
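      For reference, SparkConf already exposes setIfMissing, which writes a value only when the key is not already set; here is a minimal sketch (the object name LocalDefaultsExample is made up) using it as the local fallback for both settings:

      import org.apache.spark.{SparkConf, SparkContext}

      object LocalDefaultsExample {
        def main(args: Array[String]): Unit = {
          val sparkConf = new SparkConf()
          // setIfMissing leaves an existing value untouched, so --master / --name
          // passed by spark-submit still win on a cluster, while a run started
          // from the IDE falls back to a local master and a default app name.
          sparkConf.setIfMissing("spark.master", "local[*]")
          sparkConf.setIfMissing("spark.app.name", "app")
          val sc = new SparkContext(sparkConf)
          sc.stop()
        }
      }

      Since spark-submit propagates --master and --name to the driver as spark.* system properties, which new SparkConf() loads by default, the cluster-side values still take precedence over these fallbacks.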


            People

              Assignee: Unassigned
              Reporter: daile
              Votes: 0
              Watchers: 1
