SPARK-20153

Support multiple AWS credentials in order to access multiple Hive-on-S3 tables in a Spark application


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Incomplete
    • Affects Version/s: 2.0.1, 2.1.0
    • Fix Version/s: None
    • Component/s: Spark Core

    Description

      I need to access multiple Hive tables in my Spark application, where each Hive table is
      1- an external table with data sitting on S3, and
      2- owned by a different AWS user, so I need to provide different AWS credentials for each table.

      I am familiar with setting the AWS credentials in the Hadoop configuration object, but that does not really help me because I can only set one pair of (fs.s3a.awsAccessKeyId, fs.s3a.awsSecretAccessKey).

      From my research, there is no easy or elegant way to do this in Spark.

      Why is that?

      How do I address this use case?
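
      For context, here is a minimal sketch of the single-credential setup described above. It is an illustration under assumptions, not the reporter's actual code: the application name, table name, and key values are hypothetical, and it uses the canonical s3a property names fs.s3a.access.key / fs.s3a.secret.key, which play the same role as the pair quoted in the description.

      {code:scala}
      import org.apache.spark.sql.SparkSession

      // Hypothetical Spark application reading an external Hive table backed by S3.
      val spark = SparkSession.builder()
        .appName("hive-on-s3-example")
        .enableHiveSupport()
        .getOrCreate()

      val hadoopConf = spark.sparkContext.hadoopConfiguration

      // Only one global pair of credentials can be set this way, so every
      // S3-backed table read by this application uses the same AWS identity.
      hadoopConf.set("fs.s3a.access.key", "ACCESS_KEY_OF_ONE_AWS_USER")
      hadoopConf.set("fs.s3a.secret.key", "SECRET_KEY_OF_ONE_AWS_USER")

      // Reads succeed for tables whose S3 data this user owns, and fail with
      // an access error for tables owned by a different AWS account.
      val df = spark.sql("SELECT * FROM some_external_hive_table_on_s3")
      df.show()
      {code}

      For what it is worth, later hadoop-aws releases (2.8 and up) added per-bucket overrides such as fs.s3a.bucket.<bucket>.access.key, which is one possible direction for this use case, but that is a Hadoop feature rather than something Spark exposes directly.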

    People

      Assignee: Unassigned
      Reporter: Franck Tago (tafranky@gmail.com)
      Votes: 0
      Watchers: 7
