Spark / SPARK-14056

Add s3 configurations and spark.hadoop.* configurations to hive configuration


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.6.1
    • Fix Version/s: 2.0.0
    • Component/s: EC2, SQL
    • Labels: None

    Description

Currently, when creating a HiveConf in TableReader.scala, we do not pass along S3-specific configurations (such as AWS S3 credentials) or the spark.hadoop.* configurations set by the user. We should fix this issue.
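The essence of the fix is to copy every user-set `spark.hadoop.*` entry (with the prefix stripped) into the Hadoop/Hive configuration before it is used. Below is a minimal, hedged sketch of that prefix-copying logic using plain Scala collections in place of Spark's `SparkConf` and Hadoop's `Configuration`; the object and method names are illustrative, not Spark's actual API.

```scala
// Illustrative sketch: propagate "spark.hadoop.*" settings into a
// Hadoop-style key/value configuration, so S3 credentials and other
// Hadoop options set on the Spark side reach the HiveConf.
object HadoopConfPropagation {
  // For every key of the form "spark.hadoop.<name>", set "<name>" in the
  // target configuration map; all other Spark settings are ignored.
  def appendSparkHadoopConfigs(
      sparkSettings: Map[String, String],
      hadoopConf: scala.collection.mutable.Map[String, String]): Unit = {
    for ((key, value) <- sparkSettings if key.startsWith("spark.hadoop.")) {
      hadoopConf(key.stripPrefix("spark.hadoop.")) = value
    }
  }
}
```

In the actual fix, logic of this shape would run wherever TableReader.scala constructs its HiveConf, so that settings such as `spark.hadoop.fs.s3a.access.key` become `fs.s3a.access.key` in the Hive-side configuration.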


          People

            Assignee: Sital Kedia (sitalkedia@gmail.com)
            Reporter: Sital Kedia (sitalkedia@gmail.com)
            Votes: 0
            Watchers: 2
