[SPARK-26011] pyspark app with "spark.jars.packages" config does not work


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.2, 2.4.0
    • Fix Version/s: 2.3.3, 2.4.1, 3.0.0
    • Component/s: Spark Submit
    • Labels: None

    Description

      Command "pyspark --packages" works as expected, but if submitting a livy pyspark job with "spark.jars.packages" config, the downloaded packages are not added to python's sys.path therefore the package is not available to use.

      For example, this command works:

      pyspark --packages Azure:mmlspark:0.14

      However, opening a pyspark session from a Jupyter notebook with the sparkmagic kernel fails:

      %%configure -f {"conf": {"spark.jars.packages": "Azure:mmlspark:0.14"}}
      import mmlspark

      The root cause is that SparkSubmit decides whether an app is a pyspark app based on the suffix of the primary resource, but Livy uses "spark-internal" as the primary resource when calling spark-submit, so args.isPython is set to false in SparkSubmit.scala, as sketched below.
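
      The following is a minimal, simplified stand-in for the suffix-based check in SparkSubmit.scala, not the actual source; the object and constant names are only illustrative.

      object PrimaryResourceCheck {
        // Illustrative constants: the PySpark shell resource and the internal
        // placeholder resource that Livy passes to spark-submit.
        val PysparkShell = "pyspark-shell"
        val SparkInternal = "spark-internal"

        // Suffix-based detection: only ".py" files and the PySpark shell are
        // treated as Python apps, so "spark-internal" does not qualify.
        def isPython(primaryResource: String): Boolean =
          primaryResource.endsWith(".py") || primaryResource == PysparkShell

        def main(args: Array[String]): Unit = {
          println(isPython("job.py"))        // true  -> PySpark path setup runs
          println(isPython(SparkInternal))   // false -> args.isPython stays false
        }
      }

      Because args.isPython ends up false in the Livy case, the downloaded packages are never added to Python's sys.path, which matches the failure above.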


          People

            Assignee: shanyu zhao
            Reporter: shanyu zhao
