Description
Right now, Spark 2.0 does not load hive-site.xml. Based on users' feedback, it seems to make sense to still load this conf file.
Originally, this file was loaded when the HiveConf class was loaded, and all settings could be retrieved after creating a HiveConf instance. Let's avoid loading hive-site.xml that way. Instead, since hive-site.xml is a normal Hadoop conf file, we can first find its URL using the classloader and then use Hadoop Configuration's addResource (or add hive-site.xml as a default resource through Configuration.addDefaultResource) to load the confs.
Please note that hive-site.xml needs to be loaded into the Hadoop conf used to create metadataHive.
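The approach described above could be sketched as follows. This is a minimal illustration, not the actual Spark patch; the object name HiveSiteLoader is hypothetical, but Configuration.addResource and Configuration.addDefaultResource are real Hadoop APIs.

```scala
import java.net.URL
import org.apache.hadoop.conf.Configuration

// Hypothetical sketch: locate hive-site.xml on the classpath via the
// classloader and merge it into a Hadoop Configuration, instead of
// relying on HiveConf's static initializer to pick it up.
object HiveSiteLoader {

  def loadHiveSite(hadoopConf: Configuration): Configuration = {
    // hive-site.xml is an ordinary Hadoop conf file, so we can find it
    // with the context classloader like any other classpath resource.
    val url: URL =
      Thread.currentThread().getContextClassLoader.getResource("hive-site.xml")
    if (url != null) {
      // Merge its settings into the given conf; this conf must be the
      // one later used to create metadataHive.
      hadoopConf.addResource(url)
    }
    hadoopConf
  }
}

// Alternative: register it once as a default resource, so every new
// Configuration instance picks it up automatically.
// Configuration.addDefaultResource("hive-site.xml")
```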
Issue Links
- breaks SPARK-15991: After SparkSession has been created, setting hadoop conf through sparkSession.sparkContext.hadoopConfiguration does not affect hadoop conf used by the SparkSession (Resolved)