Hive / HIVE-14825

Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0


Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Resolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Documentation
    • Labels: None

    Description

      Considering that there is no assembly jar for Spark since 2.0.0, we should figure out the minimum set of required jars for HoS to work after bumping up to Spark 2.0.0. This way, users can decide whether to add just the required jars, or all the jars under Spark's directory for convenience.
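
      For illustration only, the Scala sketch below shows how a user might symlink a candidate subset of jars from Spark's jars directory into Hive's lib directory instead of copying everything. The jar-name prefixes, the SPARK_HOME/HIVE_HOME environment variables, and the LinkSparkJars object are assumptions made for this example; the actual minimal set is exactly what this task is meant to determine.

        import java.nio.file.{Files, Paths}
        import scala.jdk.CollectionConverters._

        // Sketch: link a candidate subset of Spark jars into Hive's lib dir.
        // The prefixes below are placeholders, not the outcome of this task.
        object LinkSparkJars {
          val candidatePrefixes = Seq("scala-library", "spark-core", "spark-network-common")

          def main(args: Array[String]): Unit = {
            val sparkJars = Paths.get(sys.env("SPARK_HOME"), "jars") // Spark 2.x ships individual jars here
            val hiveLib   = Paths.get(sys.env("HIVE_HOME"), "lib")   // where HoS picks up Spark classes

            Files.list(sparkJars).iterator().asScala
              .filter(jar => candidatePrefixes.exists(prefix => jar.getFileName.toString.startsWith(prefix)))
              .foreach { jar =>
                val link = hiveLib.resolve(jar.getFileName)
                if (!Files.exists(link)) Files.createSymbolicLink(link, jar.toAbsolutePath)
              }
          }
        }

      If only a handful of jars turn out to be required, linking them selectively keeps Hive's classpath small; otherwise, linking or copying everything under Spark's jars directory remains the convenient fallback mentioned above.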


            People

              Assignee: Rui Li (lirui)
              Reporter: Ferdinand Xu (Ferd)
              Votes: 0
              Watchers: 4
