SPARK-18648

spark-shell --jars option does not add jars to classpath on Windows


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.0.2
    • Fix Version/s: None
    • Component/s: Spark Shell, Windows
    • Environment: Windows 7 x64

    Description

      I can't import symbols from jars supplied on the command line when in the shell:

      Adding jars via --jars:

      spark-shell --master local[*] --jars path\to\deeplearning4j-core-0.7.0.jar
      

      The result is the same if I add it through Maven coordinates:

      spark-shell --master local[*] --packages org.deeplearning4j:deeplearning4j-core:0.7.0
      

      I end up with:

      scala> import org.deeplearning4j
      <console>:23: error: object deeplearning4j is not a member of package org
             import org.deeplearning4j
      

      NOTE: It works as expected when running on Linux.
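
      A possible workaround (an assumption on my part, not verified in this issue) is to also put the jar on the driver classpath explicitly with --driver-class-path, sidestepping the spark.jars handling that appears to misbehave on Windows:

      spark-shell --master local[*] --jars path\to\deeplearning4j-core-0.7.0.jar --driver-class-path path\to\deeplearning4j-core-0.7.0.jar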

      Sample output with --verbose:

      Using properties file: null
      Parsed arguments:
        master                  local[*]
        deployMode              null
        executorMemory          null
        executorCores           null
        totalExecutorCores      null
        propertiesFile          null
        driverMemory            null
        driverCores             null
        driverExtraClassPath    null
        driverExtraLibraryPath  null
        driverExtraJavaOptions  null
        supervise               false
        queue                   null
        numExecutors            null
        files                   null
        pyFiles                 null
        archives                null
        mainClass               org.apache.spark.repl.Main
        primaryResource         spark-shell
        name                    Spark shell
        childArgs               []
        jars                    file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.7.0.jar
        packages                null
        packagesExclusions      null
        repositories            null
        verbose                 true
      
      Spark properties used, including those specified through
       --conf and those from the properties file null:
      
      
      
      Main class:
      org.apache.spark.repl.Main
      Arguments:
      
      System properties:
      SPARK_SUBMIT -> true
      spark.app.name -> Spark shell
      spark.jars -> file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.7.0.jar
      spark.submit.deployMode -> client
      spark.master -> local[*]
      Classpath elements:
      file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.7.0.jar
      
      
      16/11/30 08:30:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      16/11/30 08:30:51 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
      Spark context Web UI available at http://192.168.70.164:4040
      Spark context available as 'sc' (master = local[*], app id = local-1480512651325).
      Spark session available as 'spark'.
      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/  '_/
         /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
            /_/
      
      Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101)
      Type in expressions to have them evaluated.
      Type :help for more information.
      
      scala> import org.deeplearning4j
      <console>:23: error: object deeplearning4j is not a member of package org
             import org.deeplearning4j
                    ^
      scala>
      
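      For reference, whether the jar URL was at least registered with Spark can be checked from inside the shell; a minimal check, assuming the session shown above:

      scala> sc.getConf.get("spark.jars")

      This should echo the same file:/C:/... URL listed under "Classpath elements" above, which suggests the jar is registered with Spark but never makes it onto the REPL compiler's classpath on Windows.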

            People

              Assignee: Unassigned
              Reporter: Michel Lemay (FlamingMike)