[SPARK-19878] Add Hive configuration when initializing Hive serde in InsertIntoHiveTable.scala


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.0, 1.6.0, 2.0.0
    • Fix Version/s: 2.2.1, 2.3.0
    • Component/s: SQL
    • Environment: CentOS 6.5; Hadoop 2.6.0, Spark 1.5.0, Hive 1.1.0
    • Flags: Patch

    Description

      In Spark 1.5.0, the case class InsertIntoHiveTable explicitly passes null for the Configuration when it initializes a serde:

      https://github.com/apache/spark/blob/v1.5.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala#L58

      Likewise in Spark 2.0.0, the HiveWriterContainer also passes null for the Configuration when it initializes a serde:

      https://github.com/apache/spark/blob/v2.0.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveWriterContainers.scala#L161
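
      For context, here is a minimal sketch of the kind of change this issue asks for, mirroring the newSerializer helper at the links above. The object wrapper and the extra conf parameter are illustrative only; the actual patch decides where that Configuration comes from (for example, the JobConf of the write):

{code:scala}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hive.ql.plan.TableDesc
import org.apache.hadoop.hive.serde2.Serializer

object SerdeInitSketch {
  // Current behaviour (simplified): null is passed for the Configuration,
  // so the serde only ever sees the table properties.
  def newSerializer(tableDesc: TableDesc): Serializer = {
    val serializer = tableDesc.getDeserializerClass.newInstance().asInstanceOf[Serializer]
    serializer.initialize(null, tableDesc.getProperties)
    serializer
  }

  // Proposed shape of the fix: thread the job's Hadoop/Hive configuration
  // through to initialize(), so a custom serde can read settings from it.
  def newSerializer(tableDesc: TableDesc, conf: Configuration): Serializer = {
    val serializer = tableDesc.getDeserializerClass.newInstance().asInstanceOf[Serializer]
    serializer.initialize(conf, tableDesc.getProperties)
    serializer
  }
}
{code}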

      When we implement a Hive serde, we want to use the Hive configuration to read static and dynamic settings, but we cannot, because the serde only ever receives a null Configuration.

      This patch therefore passes the Hive configuration to the serde when it is initialized.
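
      To illustrate what this enables, here is a hypothetical custom serde that reads a session setting in initialize(). The class and the property name my.serde.prefix are made up for the example; the point is that such a setting only becomes visible to the serde once a non-null Configuration is passed in:

{code:scala}
import java.util.Properties

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hive.serde2.{AbstractSerDe, SerDeStats}
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory
import org.apache.hadoop.io.{Text, Writable}

// Hypothetical pass-through serde, used only to show the configuration lookup.
class PrefixingSerDe extends AbstractSerDe {
  private var prefix: String = ""

  override def initialize(conf: Configuration, tbl: Properties): Unit = {
    // With this patch, conf is the job's Hive/Hadoop configuration instead of
    // null, so a value set with `SET my.serde.prefix=...` (or in hive-site.xml)
    // can be read here.
    if (conf != null) {
      prefix = conf.get("my.serde.prefix", prefix)
    }
  }

  override def getSerializedClass: Class[_ <: Writable] = classOf[Text]

  override def serialize(obj: AnyRef, objInspector: ObjectInspector): Writable =
    new Text(prefix + String.valueOf(obj))

  override def deserialize(blob: Writable): AnyRef = blob.toString

  override def getObjectInspector: ObjectInspector =
    PrimitiveObjectInspectorFactory.javaStringObjectInspector

  override def getSerDeStats: SerDeStats = new SerDeStats
}
{code}

      With the null Configuration passed today, the lookup in initialize() above is skipped and such settings are simply unreachable from the serde.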

      Attachments

        Issue Links

        Activity


          People

            Assignee: Vinod KC (vinodkc)
            Reporter: jianjin qin (kavn)
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved:
