ZEPPELIN-2141

sc.addPyFile("hdfs://path/to file) in zeppelin causing UnKnownHostException


Details

    • Type: Bug
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 0.6.0
    • Fix Version/s: None
    • Component/s: pySpark
    • Labels: None

    Description

      In the documentation of sc.addPyFile() it is mentioned that "Add a .py or .zip dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), or an HTTP, HTTPS or FTP URI."
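
      For illustration, a minimal sketch of the documented usage, using the SparkContext `sc` that the pyspark shell and Zeppelin provide (all paths and URLs below are hypothetical placeholders):

          # Hypothetical examples of the three documented source types for addPyFile().
          # Each should make the dependency importable in tasks on every executor.
          sc.addPyFile("/tmp/helpers.py")                           # local file
          sc.addPyFile("hdfs:///user/spark/deps/helpers.zip")       # HDFS path
          sc.addPyFile("http://repo.example.com/deps/helpers.zip")  # HTTP URI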

      But when I pass an HDFS path to the method in Zeppelin, it results in the following exception:
      Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
      : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, demo-node4.flytxt.com): java.lang.IllegalArgumentException: java.net.UnknownHostException: flycluster

      The Spark version used is 1.6.2. The same command works fine in the pyspark shell, so I think something is wrong with Zeppelin.
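
      A minimal reproduction sketch for a Zeppelin pyspark paragraph; "flycluster" is the HA nameservice name taken from the stack trace above, while the file path and the final action are hypothetical:

          %pyspark
          # Register a .py dependency that lives on HDFS. "flycluster" is the
          # nameservice reported in the UnknownHostException; the file path is
          # a hypothetical placeholder.
          sc.addPyFile("hdfs://flycluster/user/zeppelin/deps/helpers.py")

          # Any action that ships tasks to the executors then fails from Zeppelin
          # with java.net.UnknownHostException: flycluster, although the same two
          # lines work in the pyspark shell.
          sc.parallelize(range(10)).count()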

          People

            Assignee: Unassigned
            Reporter: Meethu Mathew
            Votes: 0
            Watchers: 2
