ZEPPELIN-1342: Adding dependencies via SPARK_SUBMIT_OPTIONS doesn't work on Spark 2.0.0


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.6.1
    • Fix Version/s: 0.6.2
    • Component/s: Interpreters
    • Labels: None

    Description

      Passing dependencies via SPARK_SUBMIT_OPTIONS appears to be broken in 0.6.1 and 0.7.0 when using Spark 2.0. To replicate, pass a package in SPARK_SUBMIT_OPTIONS like below:

      SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.11:3.0.0"
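
      For reference, this variable is normally exported from conf/zeppelin-env.sh before the Zeppelin daemon is started; a minimal sketch (the file location is the standard Zeppelin convention, not something stated in this report):

        # conf/zeppelin-env.sh -- read when the Zeppelin daemon starts
        export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.11:3.0.0"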

      Now try to import it with:
      import com.databricks.spark.avro._

      It will error out:
      <console>:25: error: object databricks is not a member of package com
      import com.databricks.spark.avro._
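
      For context, once the package actually makes it onto the interpreter classpath, the same import should succeed and enable the avro reader shorthand. A minimal sketch of the expected usage in a notebook paragraph (the input path is hypothetical):

        import com.databricks.spark.avro._

        // spark-avro 3.x adds an avro() shorthand to DataFrameReader via implicits,
        // equivalent to spark.read.format("com.databricks.spark.avro").load(...)
        val df = spark.read.avro("/tmp/events.avro")  // hypothetical path
        df.printSchema()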

      I checked the logs and there is no error retrieving the package, so it seems to be a classpath issue.
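
      One way to confirm the classpath theory from a notebook paragraph is to compare what Spark was told about with what the interpreter JVM actually sees. A minimal diagnostic sketch (not from the original report):

        // Jars/packages spark-submit resolved, as recorded in the Spark conf
        println(sc.getConf.get("spark.jars", "<not set>"))
        // The interpreter JVM's own classpath
        println(System.getProperty("java.class.path"))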


People

    • Assignee: Jeff Zhang (zjffdu)
    • Reporter: Mike Sells (mjsells)
    • Votes: 1
    • Watchers: 10
