Details
Type: Improvement
Status: Resolved
Priority: Minor
Resolution: Incomplete
Affects Version/s: 2.0.1, 2.1.0
Fix Version/s: None
Description
I need to access multiple Hive tables in my Spark application, where each Hive table is:
1- an external table with its data sitting on S3, and
2- owned by a different AWS user, so I need to provide different AWS credentials for each table.
I am familiar with setting the AWS credentials in the Hadoop configuration object, but that does not really help me because I can only set one pair of (fs.s3a.awsAccessKeyId, fs.s3a.awsSecretAccessKey), as in the sketch below.
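For illustration, a minimal sketch of that single-credential approach (bucket and key values are placeholders; fs.s3a.access.key / fs.s3a.secret.key are the current property names for the same pair):
{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-on-s3-single-credential")
  .enableHiveSupport()
  .getOrCreate()

// One global S3A credential pair on the shared Hadoop configuration.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", "AKIA...")  // single global access key
hadoopConf.set("fs.s3a.secret.key", "...")      // single global secret key
// Every s3a:// path now resolves with this one identity, so two external
// tables owned by different AWS users cannot both be read in one application.
{code}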
From my research, there is no easy or elegant way to do this in Spark.
Why is that? How do I address this use case?
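For reference, the per-bucket S3A configuration introduced by HADOOP-13336 (linked below, available in Hadoop 2.8+) addresses this use case. A minimal sketch, with hypothetical bucket and table names:
{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-on-s3-per-bucket-credentials")
  .enableHiveSupport()
  .getOrCreate()

val hadoopConf = spark.sparkContext.hadoopConfiguration
// Credentials used only for paths under s3a://bucket-a/
hadoopConf.set("fs.s3a.bucket.bucket-a.access.key", "AKIA...A")
hadoopConf.set("fs.s3a.bucket.bucket-a.secret.key", "...")
// Credentials used only for paths under s3a://bucket-b/
hadoopConf.set("fs.s3a.bucket.bucket-b.access.key", "AKIA...B")
hadoopConf.set("fs.s3a.bucket.bucket-b.secret.key", "...")

// Each external Hive table now resolves its own bucket with its own AWS identity.
spark.sql("SELECT COUNT(*) FROM table_on_bucket_a").show()
spark.sql("SELECT COUNT(*) FROM table_on_bucket_b").show()
{code}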
Attachments
Issue Links
- depends upon
  - HADOOP-13336 S3A to support per-bucket configuration (Resolved)
  - HADOOP-17401 GCS to support per-bucket configuration (Resolved)